
Sr. ETL Specialist/Architect Resume


Mounds View, MN

SUMMARY:

  • Over 10 years of IT experience in Data Warehousing (ETL): requirements gathering, analysis, design, development, implementation, integration, testing, and profiling, cleansing and validation of data using Informatica PowerCenter/Data Quality (IDQ), versions 7.1 through 9.6.1 HF1, for the Health Care, Insurance, Banking and Wireless industries under different methodologies.
  • Experienced in migrating code between repositories; wrote Technical/Functional mapping specification documents for each mapping, along with unit tests, for future development.
  • Proficient in designing automated workflow processes and configuring/scheduling workflows for load frequencies. Skilled in developing, testing, tuning and debugging mappings and sessions and in monitoring the system.
  • Experienced in profiling, analyzing, standardizing, cleansing, integrating and scorecarding reference data from various source systems using the Informatica Data Quality (IDQ) toolkit. Worked with Address Doctor and matching algorithms (Bigram, Jaro, Edit, Hamming and Reverse distance) in IDQ to prepare MDM data.
  • Skilled at interacting with business users. Pioneered different load strategies from heterogeneous sources to targets; successfully implemented SCD Type 1/Type 2 loads and change data capture to maintain data history.
  • Experienced in identifying data-load bottlenecks and tuning them for better performance.
  • Extensive experience creating logical and physical data models for relational (OLTP) systems and dimensional (OLAP) star schemas with fact and dimension tables using CA ERwin.
  • Experienced in Informatica administration: installing and configuring Repository and Web Services Hub services and configuring Domain/Gateway services using the Administration Console and Repository Manager tools.
  • Proficient in writing stored procedures, triggers, indexes and functions in PL/SQL and SQL scripts. Developed various reports and dashboards using MicroStrategy reporting tools.
  • Experienced in Oracle 9i database administration: Flashback and RMAN data recovery, database design, enterprise-level backup/recovery procedures, performance tuning, table partitioning, database architecture and monitoring.
  • Team player with excellent communication and interpersonal skills and strong analytical problem-solving ability.
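
The SCD Type 2 load strategy cited above can be sketched as follows — a minimal, hypothetical Python illustration of the row-versioning logic an ETL mapping applies (the field names `natural_key`, `eff_from`, `eff_to` and `current` are assumptions for illustration, not from any client system):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply an SCD Type 2 change to an in-memory dimension table.

    `dimension` is a list of row dicts with `natural_key`, `attrs`,
    `eff_from`, `eff_to` and `current` fields; `incoming` is the new
    source record. A changed record expires the current version and
    appends a new one, preserving history (unlike Type 1, which would
    overwrite in place).
    """
    today = today or date.today()
    for row in dimension:
        if row["natural_key"] == incoming["natural_key"] and row["current"]:
            if row["attrs"] == incoming["attrs"]:
                return dimension          # unchanged: nothing to do
            row["eff_to"] = today         # expire the old version
            row["current"] = False
            break
    dimension.append({                    # insert the new current version
        "natural_key": incoming["natural_key"],
        "attrs": incoming["attrs"],
        "eff_from": today,
        "eff_to": None,
        "current": True,
    })
    return dimension
```

In a real mapping this logic is expressed with Lookup, Expression and Update Strategy transformations rather than Python, but the expire-then-insert flow is the same.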

TECHNICAL SKILLS:

ETL/IDQ Tools: Informatica PowerCenter/Data Quality (IDQ)/MDM 9.6.1HF1 - 7.1, SSIS, SSRS

Database: Oracle 12c-8i, SQL Server 2005/2008 R2, DB2, Teradata, MySQL, Hadoop, Netezza.

Reporting Tools: OBIEE, Business Objects.

GUI Tools: SQL-Developer, TOAD 9.5, SQL*Plus, IIR, Web Services (WSDL), SOAP, JIRA, MDM, SAP, Putty, WinSCP, Salesforce, BTEQ, SAP HANA.

Languages: SQL, PL/SQL, Java, C, C++, C#, T-SQL, XML, Unix Shell Scripting, Perl, Python.

Operating Systems: Windows 95/98/ME/NT/XP/Vista/7/8, Unix, Mac.

Scheduling Tools: Tivoli, Control-M, UC-4, Autosys.

Modeling Tools: CA ERwin, Embarcadero, Power Designer.

PROFESSIONAL EXPERIENCE:

Sr. ETL Specialist/Architect

Confidential, Mounds View, MN

Responsibilities:

  • Worked with stakeholders on requirements gathering, analysis, design, development and testing for end-to-end solutions and successfully implemented the project. Guided and supervised six offshore resources.
  • Designed logical and physical data models using Power Designer and created a Business Objects (BO) Universe along with DDLs for different types of reports in SAP HANA.
  • Arranged daily and weekly meetings to monitor resources and updated the client on work progress.
  • Wrote Tech Specs for each process based on business requirements, designed the ETL process as a model for each layer and provided solutions in mission-critical situations wherever resources were stuck.
  • Wrote all DDL scripts to create tables, views, transaction tables, triggers and stored procedures for base tables and Confidential processes in all layers. Designed jobs with Unix shell scripts for Tivoli to schedule workflows. Wrote SOP/AID documents for a smooth project handover.
  • Migrated code from Dev to Test to Pre-Prod. Created effective unit and integration tests of data on different layers to catch data discrepancies/inaccuracies and ensure accurate data loading.
  • Extensively worked on Confidential to capture data changes in sources and for delta loads. Used the Debugger to validate mappings and gather troubleshooting information about data and error conditions.
  • Debugged invalid mappings and tested mappings, sessions and workflows to identify bottlenecks and tuned them for better performance. Built unit-test queries to verify data accuracy.
  • Peer-reviewed code against the Tech Specs to verify that the logic met business requirements and client standards, and fixed any discrepancies. Identified feasible alternative approaches, systems and equipment to reduce cost and improve efficiency while meeting expectations.

Environment: Informatica PowerCenter 9.6.1 HF2, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, Putty, WinSCP, UNIX Shell Scripting, SQL-Developer, Tivoli, ServiceNow, Power Designer, SAP HANA, Business Objects.

Sr. ETL Developer/ Specialist

Confidential, Jefferson City, MO

Responsibilities:

  • Participated in daily/weekly team meetings. Worked with the Business Analysts/Stakeholders to develop the Confidential.
  • Proposed new ETL processes for Confidential.
  • Wrote the DDD and TDD documentation along with different test cases for a smooth project handover and to maintain the SDLC.
  • Parsed high-level design specifications into ETL coding and mapping standards. Developed new complex SCD Type 1/Type 2 mappings, fixed old mappings across different layers and proposed strategies for future data growth.
  • Guided other developers and provided technical solutions as needed; peer-reviewed their code against business requirements and project standards. Verified unit-testing results.
  • Tested mappings, workflows and sessions to identify bottlenecks and tuned them for better performance. Prepared effective unit, integration and system test cases for various stages to catch data discrepancies/inaccuracies and ensure accurate data loading.
  • Extensively worked on Confidential to capture data changes in sources and for delta loads. Used the Debugger to validate mappings and gather troubleshooting information about data and error conditions.
  • Migrated code from Dev to Test and Test to Prod. Wrote detailed migration documentation covering system compatibility, objects and parameter files for a smooth transfer of code between environments.
  • Designed the automation of sessions and workflows, scheduled workflows, created worklets and tasks (command, email, assignment, control, event wait/raise, conditional flows, etc.) and configured them per business logic and requirements to load data from different sources to targets.
  • Created pre- and post-session UNIX scripts, functions, triggers and stored procedures to drop and re-create indexes and to handle complex calculations on data.
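
The pre-/post-session index pattern in the last bullet can be sketched as follows — shown against an in-memory SQLite database purely for illustration (the production versions were UNIX scripts and stored procedures against Oracle/DB2; table and index names are made up):

```python
import sqlite3

def bulk_load_with_index_rebuild(conn, rows):
    """Drop the target index, bulk-insert, then rebuild the index.

    Loading into an unindexed table avoids per-row index maintenance;
    the index is rebuilt once afterward. This mirrors the pre-session
    (drop) and post-session (re-create) steps around an ETL session.
    """
    cur = conn.cursor()
    cur.execute("DROP INDEX IF EXISTS ix_sales_cust")               # pre-session step
    cur.executemany(
        "INSERT INTO sales (cust_id, amount) VALUES (?, ?)", rows
    )
    cur.execute("CREATE INDEX ix_sales_cust ON sales (cust_id)")    # post-session step
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (cust_id INTEGER, amount REAL)")
bulk_load_with_index_rebuild(conn, [(1, 9.99), (2, 19.99), (1, 5.00)])
```

The trade-off is a one-time rebuild cost after the load versus incremental index updates on every inserted row, which usually favors the rebuild for large batch volumes.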

Environment: Informatica PowerCenter 9.6.1 HF4, DB2, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, Putty, WinSCP, UNIX Shell Scripting, SQL-Developer, JIRA, XML, Control-M.

Sr. ETL Developer/ Data Analyst

Confidential, Los Angeles, CA

Responsibilities:

  • Supervised the data-warehousing functions of data compilation, extraction, loading, cleansing and modeling.
  • Incorporated advanced technologies, application tools and analysis techniques available in the market within the organization. Validated provided data for accuracy, authenticity and legality.
  • Participated in team meetings and proposed an ETL strategy based on Agile methodology.
  • Based on subject areas, provided concrete solutions for complex/critical mappings. Successfully implemented SCD Type 1/Type 2 for insert, Confidential and delete operations to maintain data history. Created mapping and session variables/parameters, parameter files and mapplets for reuse during life-cycle development.
  • Created batches based on subject areas for different layers to run workflows/worklets and sessions, scheduled the workflows for load frequencies and configured them to load data.
  • Involved in debugging invalid mappings. Tested mappings, sessions and workflows to identify bottlenecks and tuned them for better performance. Built unit-test queries to verify data accuracy.
  • Migrated code from Development to Test and Test to Production. Created effective unit, system and integration tests of data on different layers to catch data discrepancies/inaccuracies and ensure accurate data loading. Created technical documentation for each mapping for future development.
  • Designed and coded change requests per new requirements. Created pre- and post-session UNIX scripts and stored procedures to drop and re-create indexes and to handle complex calculations.

Environment: Informatica PowerCenter 9.6.1 HF4, MS-SQL Server 2008, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, Putty, WinSCP, UNIX Shell Scripting, SQL-Developer, NoSQL.

Sr. Data Warehousing(ETL)/IDQ Developer- Informatica

Confidential, San Francisco, CA

Responsibilities:

  • Worked under Agile methodology; participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer-reviewed their development work and provided technical solutions. Proposed ETL strategies based on requirements.
  • Designed and developed various complex SCD Type 1/Type 2 mappings in different layers and migrated code from Dev to Test to Prod. Wrote techno-functional documentation along with different test cases for a smooth project handover and to maintain the SDLC.
  • Tested mappings, workflows and sessions to identify bottlenecks and tuned them for better performance. Prepared effective unit, integration and system test cases for various stages to catch data discrepancies/inaccuracies and ensure accurate data loading.
  • Analyzed data based on requirements, wrote techno-functional documentation and developed complex mappings in Informatica Data Quality (IDQ) 9.5.1 Developer to remove noise from the data using the Parser, Labeler, Standardizer, Merge, Match, Case Converter, Consolidation, Address Validator, Key Generator, Lookup and Decision transformations, and performed unit testing for accuracy of the MDM data.
  • Used different algorithms such as Bigram, Edit, Jaro, Reverse and Hamming distance to determine threshold values to identify and eliminate duplicate datasets and to validate, profile and cleanse the data. Created/modified reference tables of valid data using the IDQ Analyst tool for MDM data.
  • Used Address Doctor to validate addresses and performed exception handling, reporting and system monitoring. Created different rules as mapplets, Logical Data Objects (LDO) and workflows; deployed the workflows as an application to run them. Tuned the mappings for better performance.
  • Created pre- and post-session UNIX scripts, functions, triggers and stored procedures to drop and re-create indexes and to handle complex calculations on data. Responsible for transforming and loading large sets of structured, semi-structured and unstructured data from heterogeneous sources.
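
The edit-distance matching described above can be sketched as follows — a minimal Levenshtein implementation with a similarity threshold, standing in for the distance scoring a Match transformation performs (the 0.85 threshold is an assumed example value, not a client setting):

```python
def edit_distance(a, b):
    """Classic Levenshtein edit distance via dynamic programming:
    the minimum number of insertions, deletions and substitutions
    needed to turn string `a` into string `b`."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,              # deletion
                curr[j - 1] + 1,          # insertion
                prev[j - 1] + (ca != cb), # substitution (free on match)
            ))
        prev = curr
    return prev[-1]

def is_duplicate(a, b, threshold=0.85):
    """Treat two strings as duplicates when their similarity
    (1 - distance / longest length) meets the threshold."""
    longest = max(len(a), len(b)) or 1
    return 1 - edit_distance(a, b) / longest >= threshold
```

Raising the threshold tightens matching (fewer false duplicates, more misses); lowering it does the opposite — which is the trade-off being tuned when threshold values are determined per field.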

Environment: Informatica PowerCenter 9.5.1 HF4, Informatica Data Quality (IDQ), MDM, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, Putty, WinSCP, UNIX Shell Scripting, MS-SQL Server 2008, SQL-Developer, JIRA.

Sr. ETL Developer-Informatica

Confidential, Beverly Hills, CA

Sr. ETL / IDQ Developer- Informatica

Confidential, Sacramento, CA

Sr. ETL / IDQ Developer- Informatica

Confidential, Eagan, MN
