
Sr. ETL Data Engineer Resume


SUMMARY

  • Around 7 years of IT experience in the analysis, design, development, implementation, and troubleshooting of Data Mart and Data Warehousing applications using ETL tools such as Informatica PowerCenter 10.2/9.5/9.1/8.6/7.1, data modeling, and reporting tools.
  • Extensive experience in developing the Workflows, Worklets, Sessions, Mappings, and configuring the Informatica Server using Informatica Power Center.
  • 2 years of hands-on expertise with cloud platforms, including Informatica ICS and IICS, Amazon AWS, and Salesforce.
  • Worked on AWS EMR, Redshift and Lambda for cloud-based data solutions.
  • Experience using AWS cloud components and connectors to make API calls for accessing data from cloud storage (Amazon S3, Redshift).
  • Experienced in developing various data extracts and loading routines using Informatica and Oracle stored procedures.
  • Strong experience in Data Warehousing (ETL & OLAP) environments, with excellent analytical, coordination, and interpersonal skills and strong leadership potential.
  • Strong knowledge of Dimensional Modeling, Star and Snowflake schema. Designed Fact and Dimension Tables as per the reporting requirements and ease of future enhancements.
  • Expertise in Data Flow Diagrams and Process Models.
  • Adept at understanding Agile software development methodologies and framework.
  • Extensive work experience in ETL processes consisting of data sourcing, data transformation, mapping, and loading of data from multiple source systems into the Data Warehouse using Informatica PowerCenter 10.2/9.5/9.1/8.6/7.1.
  • Good understanding of relational database management systems; experience integrating data from various data sources such as MS SQL Server 2005/2008/2012, flat files, and XML into the staging area.
  • Experienced in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experience in debugging, error handling and performance tuning of sources, targets, mappings and sessions with the help of error logs generated by Informatica server.
  • Extensively used the Repository Manager, Designer, Workflow Manager, and Workflow Monitor client tools of Informatica.
  • Experience in data mart life cycle development, performed ETL procedure to load data from different sources into Data marts and Data warehouse using Informatica Power Center.
  • Used Debugger in Informatica Power Center Designer to check the errors in mapping.
  • Experience includes working with healthcare, Insurance and Financial organizations.
  • Experience in working with Oracle, SQL databases. Experience working on Hadoop.
  • Highly proficient in using T-SQL for developing complex Stored Procedures, Triggers, Tables, Views, User defined Functions, User profiles, Relational Database Models and Data Integrity, SQL joins, indexing and Query Writing.
  • Experience in implementing Slowly Changing dimension methodology (SCD) for accessing the full history of accounts and transaction information.
  • Experience in implementing complex business rules by creating reusable transformations (Expression, Aggregator, Filter, Connected and Unconnected Lookup, Router, Rank, Joiner, Update Strategy) and developing complex Mapplets, Mappings, and Triggers.
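The SCD Type 2 approach mentioned above can be illustrated with a minimal sketch. This is not code from any specific project; the `dim_customer` table, its columns, and the change rule are hypothetical, and SQLite stands in for the actual warehouse database.

```python
import sqlite3

# Minimal SCD Type 2 sketch: when a tracked attribute changes, expire the
# current dimension row and insert a new current version, preserving history.
# Table and column names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Austin', '2020-01-01', NULL, 1)")

def scd2_upsert(cur, customer_id, city, load_date):
    cur.execute("""SELECT city FROM dim_customer
                   WHERE customer_id = ? AND is_current = 1""", (customer_id,))
    row = cur.fetchone()
    if row and row[0] != city:
        # Attribute changed: close out the old version...
        cur.execute("""UPDATE dim_customer SET end_date = ?, is_current = 0
                       WHERE customer_id = ? AND is_current = 1""",
                    (load_date, customer_id))
        # ...and insert the new current version.
        cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                    (customer_id, city, load_date))

scd2_upsert(cur, 1, 'Dallas', '2021-06-15')
cur.execute("SELECT city, is_current FROM dim_customer ORDER BY eff_date")
print(cur.fetchall())  # [('Austin', 0), ('Dallas', 1)]
```

In PowerCenter the same pattern is typically built with a Lookup on the dimension plus an Update Strategy transformation deciding between DD_UPDATE (expire) and DD_INSERT (new version).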

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.2/9.6.1/9.5, AWS services, GCP, Informatica Cloud ICS and IICS/BDM, DataStage, Talend, Azure Data Warehouse

Cloud & Big Data: AWS Glue, DMS, QuickSight, S3, Redshift, CloudWatch, GCP, HDFS, YARN, Hadoop ecosystem, Kafka, Real-time Processing

Language: Python, C, SQL, PL/SQL, XML, UNIX Shell Scripting

Databases: Oracle 11g/10g, SQL Server, DB2, Teradata V2R5, Netezza, PostgreSQL, GCP

Operating Systems: Microsoft Windows 7/Vista/XP/2000/NT, UNIX

Other Tools: Tableau, IBM Cognos, PL/SQL Developer, PuTTY, TOAD, SQL Developer, Erwin, Microsoft Visio, Splunk, Control-M, AutoSys, Tidal, Tivoli, Visual Studio

PROFESSIONAL EXPERIENCE

Confidential

Sr. ETL Data Engineer

Responsibilities:

  • Working on migrating health care data safely from legacy systems to a new infrastructure.
  • Analyzing technical requirements and data flows and preparing technical strategy documents.
  • Benchmarking performance and fine-tuning existing Informatica mappings, sessions, and SQL by eliminating various performance bottlenecks.
  • Created new tables and added new columns to existing target and source tables in ETL per new business requirements.
  • Helped IT reduce the cost of maintaining the on-premises Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.
  • Installed Informatica Cloud secure agent and set up connections to databases like Oracle, SQL Server and cloud like Salesforce, to run Informatica Cloud mappings.
  • Developed Informatica Cloud Data Integration mappings and task flows to extract and load data between on-premises systems, AWS RDS, Amazon S3, Redshift, Azure SQL Data Warehouse, and Azure Data Lake Store; created and configured all kinds of cloud connections and runtime environments with Informatica IICS.
  • Designed several Processes on Informatica Cloud and exposed them as RESTful API services to publish data to external systems.
  • Involved in Data Modeling, E/R diagrams, normalization and de-normalization as per business requirements.
  • Developed Complex mappings to load data to different target entities like cloud, Oracle, XML and Flat files.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Expertise in parsing and handling structured and unstructured data such as JSON and NoSQL sources.
  • Created several API calls to external systems hosted by vendors to retrieve the required data.
  • Created shortcuts for reusable source/target definitions, Reusable Transformations, mapplets in Shared folder.
  • Used Debugger to test the mappings and fixed the bugs.
  • Developed mapping parameters and variables to support SQL override.
  • Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
  • Involved in performance tuning and optimization of mappings to manage very large volumes of data.
  • Implemented error handling for invalid and rejected rows by loading them into error tables.
  • Performed performance tuning to improve data extraction, processing, and load times.
  • Worked extensively on the batch framework used to schedule all Informatica jobs.
  • Designed presentations based on the test cases and obtained UAT signoffs.
  • Documented test scenarios as part of unit testing before requesting migration to higher environments, and handled production deployments.
  • Collaborated with BI and BO teams to observe how reports are affected by a change to the corporate data model.
  • Scheduled the jobs using Tidal.
  • Handled Production issues and monitored Informatica workflows in production.
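The error-handling bullet above (loading invalid and rejected rows into error tables) can be sketched as a simple routing step. This is a minimal illustration, not the project's actual logic; the field names and validation rules are hypothetical assumptions.

```python
# Sketch of the reject-row pattern: rows that fail validation are diverted
# to an error table instead of aborting the whole load. The member_id /
# amount fields and the rules below are illustrative assumptions.
def load_with_error_handling(rows, target, errors):
    for row in rows:
        # Hypothetical rules: member_id must be numeric, amount non-negative.
        if not str(row.get("member_id", "")).isdigit() or row.get("amount", 0) < 0:
            errors.append({**row, "error": "failed validation"})
        else:
            target.append(row)

target, errors = [], []
load_with_error_handling(
    [{"member_id": "101", "amount": 50.0},   # valid
     {"member_id": "abc", "amount": 20.0},   # rejected: non-numeric id
     {"member_id": "102", "amount": -5.0}],  # rejected: negative amount
    target, errors)
print(len(target), len(errors))  # 1 2
```

In PowerCenter the equivalent is usually a Router transformation splitting valid and invalid rows into the target and an error table, so bad data can be reviewed and reprocessed rather than lost.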

Environment: Informatica Cloud, AWS, S3, Oracle, Python, Informatica Power Center 10.2, SQL Server, Tivoli, Flat Files.

Confidential

ETL/Informatica Developer

Responsibilities:

  • Designed, developed, implemented and maintained Informatica PowerCenter 9.6.1 application for implementation of Business requirements.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
  • Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.
  • Developed several reusable transformations and mapplets that were used in other mappings.
  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Migrated Mappings, Sessions, and Workflows from Development to Test environments.
  • Created sessions, extracted data from various sources, transformed it according to requirements, and loaded it into the data warehouse.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Imported various heterogeneous files using Informatica Power Center 9.x Source Analyzer.
  • Prepared Technical Design documents and Test cases.
  • Used scheduling tools to create new jobs and set up job dependencies across different Autosys cycles.
  • Experienced in writing SQL queries for retrieving information from the database based on the requirement.
  • Worked on different tasks in workflows such as sessions, event-raise, event-wait, email, command, worklets, and workflow scheduling.
  • Strong experience in designing and developing mappings to extract data from different sources including flat files and XML.
  • Worked with the offshore team and supervised their development activities.
  • Reviewed code and confirmed it conformed to standard programming practices.
  • Conducted Knowledge transfer sessions about the project to the team and managed the project by providing reviews on it.
  • Implemented various Performance Tuning techniques.
  • Implemented Aggregate, Filter, Join, Expression, Lookup and Update Strategy transformations.

Environment: Informatica PowerCenter 9.6, SQL Server, Oracle 11g, Super Putty, Toad, Teradata, UNIX.

Confidential

ETL/Informatica Developer

Responsibilities:

  • Created complex mappings in PowerCenter Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML, Source Qualifier and Stored procedure transformations.
  • Created interactive visualizations and dashboards using Tableau that enabled business users and executives to explore product usage and customer trends.
  • Developed mappings/mapplets by using Mapping designer, Transformation developer and Mapplet designer in Informatica PowerCenter
  • Handled Slowly Changing Dimensions (SCD) (Type I, Type II and Type III) based on the business requirements.
  • Designed and developed Informatica mappings for data sharing between interfaces utilizing SCD type 2 and CDC methodologies.
  • Used Informatica PowerCenter and all features extensively in migrating data from OLTP to Enterprise Data warehouse. Used Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
  • Extracted data from different sources like Oracle, flat files, XML, DB2 and SQL Server loaded into Data Warehouse (DWH).
  • Worked on FACETS data tables, created audit reports using queries, and re-engineered and captured transactions with legacy systems.
  • Involved in creation of Folders, Users, Repositories and Deployment Groups using Repository Manager.
  • Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
  • Wrote PL/SQL stored procedures & triggers, cursors for implementing business rules and transformations.
  • Performed Unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database thereby tuning it by identifying and eliminating the bottlenecks for optimum performance.
  • Worked extensively with different caches such as Index cache, Data cache, and Lookup cache (Static, Dynamic, Persistent, and Shared).
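The static vs. dynamic lookup cache distinction in the last bullet can be modeled with a minimal sketch. Plain dicts stand in for PowerCenter's caches, and the keys and surrogate values are illustrative assumptions, not data from any project.

```python
# Sketch of static vs. dynamic lookup caching: a static cache never changes
# during the run, so repeated misses stay misses; a dynamic cache inserts
# new keys as they arrive, so later rows in the same run get a hit.
def run_lookup(source_keys, cache, dynamic):
    surrogate = max(cache.values(), default=0)
    out = []
    for key in source_keys:
        if key not in cache:
            if dynamic:
                # Dynamic cache: register the new key with a fresh
                # surrogate so subsequent rows resolve to it.
                surrogate += 1
                cache[key] = surrogate
            else:
                out.append((key, None))  # static cache: miss stays a miss
                continue
        out.append((key, cache[key]))
    return out

static = run_lookup(["A", "B", "B"], {"A": 1}, dynamic=False)
dynamic = run_lookup(["A", "B", "B"], {"A": 1}, dynamic=True)
print(static)   # [('A', 1), ('B', None), ('B', None)]
print(dynamic)  # [('A', 1), ('B', 2), ('B', 2)]
```

This is why dynamic lookup caches are the usual choice when loading a dimension and its fact in the same run: new dimension rows become visible to later source rows immediately.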

Environment: Informatica PowerCenter 9.1/8.6.1, Oracle 11g, SQL Server 2005, SSIS 2005, UDB DB2 8.1, XML, Autosys, TOAD, SQL, PL/SQL, UNIX

Confidential

Software developer

Responsibilities:

  • Created mappings using transformations such as the Source qualifier, Aggregator, Expression, lookup, Filter, Router, Rank, Sequence Generator, Update Strategy etc.
  • Developed, managed and tested backup and recovery plans.
  • Studied data sources by interviewing users.
  • Wrote database scripts for user management and roles.
  • Created, monitored and maintained Oracle databases.
  • Extracted data from various data sources such as flat files and DB2 10.5, and transformed and loaded them into targets using Informatica.
  • Used various performance tuning techniques to improve the session performance (Partitioning etc.).
  • Used Informatica client tools - Designer, Workflow Manager, and Workflow Monitor.
  • Generated completion messages and status reports using Workflow manager.

Environment: Informatica PowerCenter 8.6, Visual Basic 6.0, PL/SQL, Oracle 9.x/10g, Flat Files, UNIX, MS SQL Server 2008, SQL, and SQL*Plus.
