
Sr. ETL Specialist/Architect Resume


Mounds View, MN

SUMMARY

  • Over 10 years of IT experience in Data Warehousing (ETL): requirements gathering, analysis, design, development, implementation, integration, testing, profiling, cleansing and validation of data using Informatica PowerCenter/Data Quality (IDQ) versions 7.1 to 9.6.1 HF1 for the Health Care, Insurance, Banking and Wireless industries under different methodologies.
  • Experienced in migrating code from repository to repository; wrote Technical/Functional Mapping specification documents for each mapping, along with unit tests, for future development.
  • Proficient in designing the automation of workflows and configuring/scheduling workflows for load frequencies. Skilled in developing, testing, tuning and debugging mappings and sessions and monitoring the system.
  • Experienced in profiling, analyzing, standardizing, cleansing, integrating, scorecarding and managing reference data from various source systems using the Informatica Data Quality (IDQ) toolkit. Worked with Address Doctor and different matching algorithms (Bigram/Jaro/Edit/Hamming/Reverse distance) in IDQ to prepare MDM data (see the similarity-scoring sketch after this list).
  • Skilled at interacting with business users. Pioneered different load strategies from heterogeneous sources to targets. Successfully implemented SCD Type 1/Type 2 loads and Change Data Capture to maintain data history.
  • Experienced in identifying data-load bottlenecks and tuning them for better performance.
  • Extensive experience creating logical and physical data models for relational (OLTP) systems and dimensional (OLAP) star schemas with fact and dimension tables using CA ERwin.
  • Experienced in Informatica administration: installing and configuring Repository and Web Hub services and configuring Domain/Gateway services using the Administration Console and Repository Manager tools.
  • Adept at writing stored procedures, triggers, indexes and functions in PL/SQL and SQL scripts. Developed various reports and dashboards using MicroStrategy reporting tools.
  • Experienced in Oracle 9i database administration: Flashback and RMAN data recovery, database design, enterprise-level backup/recovery procedures, performance tuning, table partitioning, database architecture and monitoring.
  • Team player with excellent communication and interpersonal skills and strong analytical problem-solving ability.
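
As a rough illustration of the similarity scoring behind the matching algorithms above (outside IDQ itself), here is a minimal Oracle SQL sketch using the built-in UTL_MATCH package; the CUSTOMER_STG table, FULL_NAME column and the 85-point threshold are hypothetical.

    -- Score each candidate pair of staged customer records; pairs scoring at or
    -- above the threshold are flagged as likely duplicates for review.
    SELECT a.cust_id,
           b.cust_id AS possible_dup_id,
           UTL_MATCH.EDIT_DISTANCE_SIMILARITY(a.full_name, b.full_name) AS edit_score,
           UTL_MATCH.JARO_WINKLER_SIMILARITY(a.full_name, b.full_name)  AS jw_score
    FROM   customer_stg a
    JOIN   customer_stg b
           ON a.cust_id < b.cust_id        -- compare each pair only once
    WHERE  UTL_MATCH.JARO_WINKLER_SIMILARITY(a.full_name, b.full_name) >= 85
    ORDER  BY jw_score DESC;

Both UTL_MATCH functions return a 0-100 similarity score, so the threshold plays the same role as the match-score thresholds tuned in IDQ.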

TECHNICAL SKILLS

ETL/IDQ Tools: Informatica PowerCenter/Data Quality (IDQ)/MDM 7.1 - 9.6.1 HF1, SSIS, SSRS

Database: Oracle 8i-12c, SQL Server 2005/2008 R2, DB2, Teradata, MySQL, Hadoop, Netezza.

Reporting Tools: OBIEE, BusinessObjects.

GUI Tools: SQL-Developer, TOAD 9.5, SQL*Plus, IIR, Web Services (WSDL), SOAP, JIRA, MDM, SAP, Putty, WinSCP, Salesforce, BTEQ, SAP HANA.

Languages: SQL, PL/SQL, Java, C, C++, C#, T-SQL, XML, Unix Shell Scripting, Perl, Python.

Operating Systems: Windows 95/98/ME/NT/XP/Vista/7/8, Unix, Mac.

Scheduling Tools: Tivoli, Control-M, UC-4, Autosys.

Modeling Tools: CA ERwin, Embarcadero, Power Designer.

PROFESSIONAL EXPERIENCE

Sr. ETL Specialist/Architect

Confidential, Mounds View, MN

Responsibilities:

  • Worked with stakeholders on requirements gathering, analysis, design, development and testing for end-to-end solutions and successfully implemented the project. Guided and supervised six offshore resources.
  • Designed logical and physical data models using Power Designer and created a BusinessObjects (BO) Universe, along with DDLs for different types of reports, in SAP HANA.
  • Arranged daily and weekly meetings to monitor the resources and updated the client on work progress.
  • Wrote the Tech Specs for each process based on business requirements, designed the ETL process as a model for each layer and provided solutions in mission-critical situations wherever resources got stuck.
  • Wrote all DDL scripts to create tables, views, transaction tables, triggers and stored procedures for base tables and CDC processes in all layers. Designed jobs with Unix shell scripts for Tivoli to schedule workflows. Wrote SOP/AID documents for smooth handover of the project.
  • Migrated code from Dev to Test to Pre-Prod. Created effective unit and integration tests of data on different layers to capture data discrepancies/inaccuracies and ensure successful, accurate data loading.
  • Extensively worked on CDC to capture data changes in the sources for delta loads (see the delta-extract sketch after this list). Used the Debugger to validate mappings and gather troubleshooting information about the data and error conditions.
  • Debugged invalid mappings and tested mappings, sessions and workflows to find bottlenecks, then tuned them for better performance. Built unit test queries to verify data accuracy.
  • Peer-reviewed code to cross-check that the logic met business requirements and client standards per the Tech Specs, and fixed any discrepancies. Identified feasible alternative approaches, systems and equipment to reduce cost and improve efficiency while meeting expectations.
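
A minimal Oracle SQL sketch of the timestamp-based CDC delta extract described above; the ORDERS_SRC table, LAST_UPD_TS column and ETL_WATERMARK control table are hypothetical names used only for illustration.

    -- Pull only the rows changed since the previous run, using a control table
    -- that stores the high-water mark of the last successful extract.
    SELECT o.*
    FROM   orders_src o
    WHERE  o.last_upd_ts > (SELECT w.last_extract_ts
                            FROM   etl_watermark w
                            WHERE  w.source_name = 'ORDERS_SRC');

    -- After a successful load, advance the watermark for the next delta run.
    UPDATE etl_watermark
    SET    last_extract_ts = SYSTIMESTAMP
    WHERE  source_name = 'ORDERS_SRC';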

Environment: Informatica PowerCenter 9.6.1 HF2, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, Putty, WinSCP, UNIX Shell Scripting, SQL-Developer, Tivoli, ServiceNow, Power Designer, SAP HANA, BusinessObjects.

Sr. ETL Developer/ Specialist

Confidential, Jefferson City, MO

Responsibilities:

  • Participated in daily/weekly team meetings. Worked with the Business Analysts/stakeholders to develop the FDD. Proposed new ETL processes for DST (Determination Summary Tables). Wrote the DDD and TDD documentation, along with different test cases, for smooth handover of the project and to maintain the SDLC.
  • Parsed high-level design specifications into ETL coding and mapping standards. Developed new complex SCD Type 1/Type 2 mappings, fixed old mappings in different layers and proposed strategies for future data growth.
  • Guided the other developers and provided technical solutions as needed; peer-reviewed their code for accuracy against business requirements and project standards. Verified unit testing results.
  • Tested mappings, workflows and sessions to find bottlenecks and tuned them for better performance. Prepared effective Unit, Integration and System test cases for various stages to capture data discrepancies/inaccuracies and ensure successful, accurate data loading.
  • Extensively worked on CDC to capture data changes in the sources for delta loads. Used the Debugger to validate mappings and gather troubleshooting information about the data and error conditions.
  • Migrated code from Dev to Test and Test to Prod. Wrote detailed migration documentation covering system compatibility, objects and parameter files for smooth transfer of code across environments.
  • Designed the automation of sessions and workflows, scheduled the workflows, created worklets (command, email, assignment, control, event wait/raise, conditional flows, etc.) and configured them per business logic and requirements to load data from different sources to targets.
  • Created pre- and post-session UNIX scripts, functions, triggers and stored procedures to drop and re-create indexes and to handle complex calculations on data (see the index-maintenance sketch below).

Environment: Informatica PowerCenter 9.6.1 HF4, DB2, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, Putty, WinSCP, UNIX Shell Scripting, SQL-Developer, JIRA, XML, Control-M.
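A minimal PL/SQL sketch of the drop-and-rebuild index pattern used in the pre/post-session scripts above; the STG_CLAIMS table and IDX_STG_CLAIMS_ID index are hypothetical.

    -- Pre-session: drop the index so the bulk load avoids per-row index maintenance.
    CREATE OR REPLACE PROCEDURE drop_stage_index IS
    BEGIN
      EXECUTE IMMEDIATE 'DROP INDEX idx_stg_claims_id';
    EXCEPTION
      WHEN OTHERS THEN
        IF SQLCODE != -1418 THEN   -- ORA-01418: index does not exist; safe to ignore
          RAISE;
        END IF;
    END;
    /

    -- Post-session: re-create the index once the load has finished.
    CREATE OR REPLACE PROCEDURE rebuild_stage_index IS
    BEGIN
      EXECUTE IMMEDIATE 'CREATE INDEX idx_stg_claims_id ON stg_claims (claim_id)';
    END;
    /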

Sr. ETL Developer/ Data Analyst

Confidential, Los Angeles, CA

Responsibilities:

  • Supervised the data warehousing functions of data compilation, extraction, loading, cleansing and modeling.
  • Incorporated advanced technologies, application tools and analysis techniques available in the market into the organization. Validated the provided data for accuracy, authenticity and legality.
  • Participated in team meetings and proposed an ETL strategy based on the Agile methodology.
  • Provided concrete solutions for complex/critical mappings based on subject areas. Successfully implemented SCD Type 1/Type 2 for insert, CDC and delete operations to maintain data history (see the SCD Type 2 sketch after this list). Created mapping and session variables/parameters, parameter files and mapplets for reuse during life-cycle development.
  • Created batches based on subject areas for different layers to run workflows/worklets and sessions, scheduled the workflows for load frequencies and configured them to load data.
  • Involved in debugging invalid mappings. Tested mappings, sessions and workflows to find bottlenecks and tuned them for better performance. Built unit test queries to verify data accuracy.
  • Migrated code from Development to Test and Test to Production. Created effective Unit, System and Integration tests of data on different layers to capture data discrepancies/inaccuracies and ensure successful, accurate data loading. Created technical documentation for each mapping for future development.
  • Designed and coded change requests per new requirements. Created pre- and post-session UNIX scripts and stored procedures to drop and re-create indexes and to handle complex calculations.

Environment: Informatica PowerCenter 9.6.1 HF4, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, Putty, WinSCP, UNIX Shell Scripting, MS SQL Server 2008, SQL-Developer, NoSQL.
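A minimal Oracle SQL sketch of the SCD Type 2 pattern referenced above; DIM_CUSTOMER, STG_CUSTOMER and the CURRENT_FLAG/EFF_START_DT/EFF_END_DT columns are hypothetical names used for illustration.

    -- Step 1: expire the current dimension row when an incoming attribute differs.
    UPDATE dim_customer d
    SET    d.current_flag = 'N',
           d.eff_end_dt   = TRUNC(SYSDATE) - 1
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.cust_id  = d.cust_id
                   AND    s.address != d.address);   -- changed attribute

    -- Step 2: insert a fresh current row for new and changed customers.
    INSERT INTO dim_customer (cust_id, address, current_flag, eff_start_dt, eff_end_dt)
    SELECT s.cust_id, s.address, 'Y', TRUNC(SYSDATE), DATE '9999-12-31'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.cust_id = s.cust_id
                       AND    d.current_flag = 'Y');

Running step 1 before step 2 matters: changed customers then have no current row left, so the insert picks them up along with brand-new ones while the expired rows preserve the history.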

Sr. Data Warehousing (ETL)/IDQ Developer - Informatica

Confidential, San Francisco, CA

Responsibilities:

  • Worked under the Agile methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer-reviewed their development work and provided technical solutions. Proposed ETL strategies based on requirements.
  • Designed and developed various complex SCD Type 1/Type 2 mappings in different layers and migrated the code from Dev to Test to Prod environments. Wrote techno-functional documentation, along with different test cases, for smooth handover of the project and to maintain the SDLC.
  • Tested mappings, workflows and sessions to find bottlenecks and tuned them for better performance. Prepared effective Unit, Integration and System test cases for various stages to capture data discrepancies/inaccuracies and ensure successful, accurate data loading.
  • Analyzed the data based on requirements, wrote techno-functional documentation and developed complex mappings using Informatica Data Quality (IDQ) 9.5.1 Developer to remove noise from the data using Parser, Labeler, Standardization, Merge, Match, Case Conversion, Consolidation, Address Validation, Key Generator, Lookup, Decision and other transformations, and performed unit testing for accuracy of the MDM data.
  • Used different algorithms such as Bigram, Edit, Jaro, Reverse and Hamming distance to determine threshold values for identifying and eliminating duplicate datasets and for validating, profiling and cleansing the data. Created/modified reference tables of valid data using the IDQ Analyst tool for MDM data.
  • Used Address Doctor to validate addresses and performed exception handling, reporting and system monitoring (see the error-logging sketch below). Created different rules as mapplets, Logical Data Objects (LDO) and workflows; deployed the workflows as an application to run them. Tuned the mappings for better performance.
  • Created pre- and post-session UNIX scripts, functions, triggers and stored procedures to drop and re-create indexes and to handle complex calculations on data. Responsible for transforming and loading large sets of structured, semi-structured and unstructured data from heterogeneous sources.

Environment: Informatica PowerCenter 9.5.1 HF4, Informatica Data Quality (IDQ), MDM, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, Putty, WinSCP, UNIX Shell Scripting, MS SQL Server 2008, SQL-Developer, JIRA.
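A minimal Oracle sketch of load-time exception handling in the spirit described above, using the built-in DBMS_ERRLOG facility; the TGT_PARTY and STG_PARTY tables are hypothetical.

    -- One-time setup: create the error log table (ERR$_TGT_PARTY by default).
    BEGIN
      DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'TGT_PARTY');
    END;
    /

    -- Load the staged rows; invalid rows are diverted to the error log with a
    -- run tag instead of failing the whole load, so they can be reported on.
    INSERT INTO tgt_party (party_id, party_name, addr_line1)
    SELECT party_id, party_name, addr_line1
    FROM   stg_party
    LOG ERRORS INTO err$_tgt_party ('NIGHTLY_LOAD') REJECT LIMIT UNLIMITED;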
