
Data Engineer Resume


Santa Monica, CA

SUMMARY

  • Around 8 years of professional experience designing and implementing Data Warehouse applications using Informatica Power Center 9.x/8.x/7.x across business domains including finance, banking, insurance, and health care.
  • Involved in different phases of Data Warehouse Life Cycle including business requirements gathering, source system analysis, ETL design/development, project deployment, production support, maintenance of client/server Data warehouse and Data mart systems.
  • Extensively worked with various components of the Informatica Power Center tools - Power Center Designer, Repository Manager, Workflow Manager, and Workflow Monitor to create mappings for the extraction of data from various source systems.
  • Worked on various types of transformations like Lookup, Update Strategy, Stored Procedure, Joiner, Filter, Aggregator, Rank, Router, Normalizer, Sorter, External Procedure, Sequence Generator, and Source Qualifier.
  • Experience in upgrading from Informatica Power Center 9.1 to Informatica Power Center 9.5.
  • Experience monitoring environments and suggesting stability and performance improvements through database tuning, partitioning, index usage, aggregate tables, session partitioning, load strategies, commit intervals, and transformation tuning.
  • Worked on integration of multiple data sources and targets like Oracle, SQL Server, flat files, COBOL files and XML.
  • Good understanding of database and data warehousing concepts (OLTP and OLAP) and code management tools.
  • Implemented the concept of Slowly Changing Dimensions (SCD) Type I and Type II to maintain current and historical data in the dimension.
  • Implemented Data Cleansing, Data Analysis, Data Profiling, Transformation Scripts, Stored Procedures/Triggers and necessary Test plans to ensure the successful execution of the data loading processes.
  • Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages and Triggers.
  • Extensive experience in using Informatica tool for implementation of ETL methodology in Data Extraction, Transformation and Loading.
  • Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems including flat files.
  • Good knowledge in Normalizing and De-normalizing the tables and maintaining Referential Integrity by using Triggers, Primary and Foreign Keys.
  • Experience in Dimension Data modeling concepts like Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling.
  • Developed UNIX shell scripts to archive files after extracting and loading data to the warehouse, and wrote shell scripts to automate manual tasks.
  • Experience with Agile (Scrum) and Waterfall methodologies.
  • Full Software Development Life Cycle (SDLC) experience including Analysis, Design and Review of Business and Software Requirement Specifications.
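The post-load archiving pattern mentioned above (shell scripts that sweep processed source files into an archive after loading) can be sketched as follows; all directory and file names here are hypothetical stand-ins, not paths from any actual project:

```shell
#!/bin/sh
# Illustrative post-load archive sweep. SRC_DIR, ARCH_DIR, and the sample
# file are made-up placeholders for whatever the real load used.
SRC_DIR=./inbound
ARCH_DIR=./archive/$(date +%Y%m%d)

mkdir -p "$SRC_DIR" "$ARCH_DIR"
printf 'sample,row\n' > "$SRC_DIR/orders_20240115.csv"  # stand-in source file

for f in "$SRC_DIR"/*.csv; do
  [ -f "$f" ] || continue                # nothing to archive
  mv "$f" "$ARCH_DIR/"                   # move the processed file out of inbound
  gzip -f "$ARCH_DIR/$(basename "$f")"   # compress it in the dated archive dir
done

ls "$ARCH_DIR"
```

In a real deployment this would run as a post-session command after a successful workflow, with logging and failure handling added.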

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9.x/8.x, Power Exchange 9.1/8.1, Talend Big Data

Languages: SQL, PL/SQL, C, C++, XML, HTML, Visual Basic 6.0

Operating Systems: UNIX, Windows XP/7

Tools: PL/SQL, SQL, Developer 2000, Oracle Developer Suite, SQL*Plus, Toad 8.x/9.x/10.x, SQL*Loader, MultiLoad, Teradata 12/13

Databases: Oracle 11g/10g, SQL Server 2000, Teradata 14/12, DB2

Job Scheduling: Autosys

BI Tools: Tableau

PROFESSIONAL EXPERIENCE

Confidential, Santa Monica, CA

Data Engineer

Responsibilities:

  • Performed logical and physical data modeling using Erwin for the data warehouse database in a star schema.
  • Worked in an Agile methodology, reporting daily activity to the Scrum Master.
  • Worked on biweekly Confidential.
  • Used JIRA to track all activities, including design, testing, and migration.
  • Worked on a Hive database using the Talend Big Data platform; migrated user information from the data lake to Salesforce Marketing Cloud using Talend jobs.
  • Generated TSV files for user migration using Talend jobs and provided them to an external vendor.
  • The external vendor loads the user data into a Salesforce data extension.
  • Once migration is done, performed unit testing between the source data and Salesforce Marketing Cloud.
  • Worked on real-time order management, pulling real-time data for reporting purposes using Informatica Power Center.
  • Scheduled the end-to-end job to run every 2 hours using the Informatica scheduler.
  • Extracted data from various centers, with data residing in systems such as Oracle Database and SQL Server, and loaded it into Oracle staging using Informatica Power Center 9.5.
  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, SQL transformation, Normalizer, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Created tickets for bugs and issues in JIRA and assigned them to the relevant teams.
  • Worked with offshore teams to coordinate daily activities.

Environment: Informatica Power Center 9.6, Informatica Power Exchange 9.5, MySQL, Oracle EBS, Oracle 11g, UNIX, PL/SQL, SQL, Unix Shell Scripts, TOAD 10.1.1, Putty, Erwin 8.0, SQL*Loader, GIT, JIRA, Tableau, Hadoop, Talend, Salesforce Data Extension

Confidential, Santa Monica, CA

Sr. Informatica Developer/Production Support Engineer

Responsibilities:

  • Performed logical and physical data modeling using Erwin for the data warehouse database in a star schema.
  • Worked in an Agile methodology with one-week Confidential; reported daily activity to the Scrum Master.
  • Used JIRA to track all activities, including design, testing, and migration.
  • Created jobs to load data from CSV files into Oracle staging using Informatica Power Center 9.6.
  • Loaded files arriving from an external source into the Oracle database.
  • Worked on data integration, combining data from various sources and loading it into the Oracle data warehouse.
  • Worked on data conversion for different systems.
  • Completed unit testing after data conversion.
  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Working with Data migration Team to migrate Orders from one system (Bydesign) to another system (Oracle EBS).
  • Worked on different tasks in Workflows like sessions, events raise, event wait, decision, e-mail, command, worklets, Assignment, Timer and scheduling of the workflow.
  • Migrated code from Dev to QA, QA to UAT, and UAT to PROD.
  • Maintained version control and release notes in Git.
  • Developed mapping variables and parameters to support SQL override.
  • Involved in Production Support activity.
  • Created tickets for bugs and issues in JIRA and assigned them to the relevant teams.
  • Extracted user data from EBS and loaded it into the data warehouse.
  • Wrote complex SQL queries involving joins across multiple tables, as well as queries to check data consistency and update tables per business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Performance tuning of the mappings and sessions.
  • Scheduled workflows using the Informatica Scheduler.
  • Implemented restart strategy and error handling techniques to recover failed sessions.
  • Used Unix Shell Scripts to automate pre-session and post-session processes.
  • Used ER Studio to analyze and optimize database and data warehouse structure.
  • Reviewed defects raised by the UAT team and followed up on critical defects to ensure they were fixed.
  • Basic knowledge of Talend, Hadoop, and Tableau; hands-on experience creating reports in Tableau.
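The mapping variables and parameters used to support SQL overrides (noted above) are typically fed in through an Informatica parameter file; a minimal sketch, with folder, workflow, session, and variable names invented for illustration:

```
[MyFolder.WF:wf_daily_load.ST:s_m_load_orders]
$$LAST_EXTRACT_DATE=2024-01-15
$$SOURCE_FILTER=ORDER_STATUS = 'BOOKED'
$DBConnection_Src=Conn_Oracle_EBS
```

A mapping variable such as $$LAST_EXTRACT_DATE can then be referenced in a Source Qualifier SQL override, so the same mapping serves different loads without code changes.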

Environment: Informatica Power Center 9.6, MySQL, Oracle EBS, Oracle 11g, UNIX, PL/SQL, SQL, Unix Shell Scripts, TOAD 10.1.1, Putty, Erwin 8.0, SQL *Loader, GIT, JIRA, Tableau, Talend

Confidential, Atlanta

Sr. Informatica Developer

Responsibilities:

  • Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design.
  • Translated requirements and high-level design into detailed functional design specifications.
  • Extracted data from various centers, with data residing in systems such as MySQL and SQL Server, and loaded it into Oracle staging using Informatica Power Center 9.5.
  • Involved in migration from Informatica Power Center 9.1 to Informatica Power Center 9.5.
  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Implemented the concept of slowly changing dimensions (SCD) Type I and Type II to maintain current and historical data in the dimension.
  • Developed complex Informatica mappings to implement a Change Data Capture (CDC) mechanism using SCD Type-2 effective date and time logic.
  • Created critical reusable transformations, mapplets, and worklets wherever necessary.
  • Integrated IDQ mappings, rules as mapplets within Power Center Mappings.
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.
  • Wrote complex SQL Queries involving multiple tables with joins and also generated queries to check for consistency of the data in the tables and to update the tables as per the Business requirements.
  • Extensively used SQL tools like TOAD, Rapid SQL and Query Analyzer, to run SQL queries to validate the data.
  • Involved in debugging Informatica mappings, testing Stored Procedures and Functions, and performance and unit testing of Informatica sessions, batches, and target data.
  • Implemented restart strategy and error handling techniques to recover failed sessions.
  • Used Unix Shell Scripts to automate pre-session and post-session processes.
  • Developed Oracle PL/SQL Package, procedure, function and trigger.
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
  • Reviewed defects raised by the UAT team and followed up on critical defects to ensure they were fixed.
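The SCD Type-2 loads described above were implemented with Lookup and Update Strategy transformations; purely as an illustration of the underlying expire-and-insert idea, here is the same pattern on pipe-delimited flat files (all file names and sample rows are invented):

```shell
#!/bin/sh
# Illustrative SCD Type-2 merge on flat files (not the actual project code).
# dim.dat   : cust_id|name|eff_date|end_date|curr_flag
# delta.dat : cust_id|name|load_date

cat > delta.dat <<'EOF'
1|Alice Smith|2024-01-15
3|Carol|2024-01-15
EOF

cat > dim.dat <<'EOF'
1|Alice|2020-01-01|9999-12-31|Y
2|Bob|2020-01-01|9999-12-31|Y
EOF

awk -F'|' -v OFS='|' '
  NR == FNR { nm[$1] = $2; dt[$1] = $3; next }    # pass 1: load the delta
  {
    seen[$1] = 1
    if ($5 == "Y" && ($1 in nm) && nm[$1] != $2) {
      print $1, $2, $3, dt[$1], "N"                # expire the current version
      print $1, nm[$1], dt[$1], "9999-12-31", "Y"  # insert the new version
    } else {
      print                                        # unchanged rows pass through
    }
  }
  END {
    for (id in nm)                                 # brand-new keys
      if (!(id in seen)) print id, nm[id], dt[id], "9999-12-31", "Y"
  }
' delta.dat dim.dat > dim_new.dat

cat dim_new.dat
```

The expired row keeps its original effective date but gets the load date as its end date and a current flag of N, matching the Type-2 effective-date logic used in the mappings.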

Environment: Informatica Power Center 9.5/9.1, Oracle 11g, UNIX, PL/SQL, SQL, MySQL, Unix Shell Scripts, TOAD 10.1.1, Putty, Erwin 8.0, SQL*Loader.

Confidential, Arizona

Informatica Developer

Responsibilities:

  • Interacted with Business Analysts for Requirement gathering, understanding the Requirements, Explanation of technical probabilities and Application flow.
  • Developed ETL mappings, transformations using Informatica Power Center 9.1
  • Extracted data from flat files, DB2 and loaded the data into Oracle staging using Informatica Power Center.
  • Designed and created complex source to target mapping using various transformations inclusive of but not limited to Sorter, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router Transformations.
  • Extensively used the Lookup and Update Strategy transformations while working with Slowly Changing Dimensions (SCD), including reading and loading high-volume Type 2 dimensions.
  • Extensively used Informatica debugger to figure out the problems in mappings. Also involved in troubleshooting existing ETL bugs.
  • Implemented Incremental loading of mappings using Mapping Variables and Parameter Files.
  • Experienced in designing and developing Informatica IDQ environment.
  • Used Mapping Parameters and Mapping Variables based on business rules provided.
  • Wrote PL/SQL Procedures for data extractions, transformation and loading.
  • Assisted in Data Modeling and Dimensional Data Modeling.
  • Involved in performance tuning by identifying bottlenecks in targets, sources, mappings, sessions, or the system, which led to better session performance.
  • Scheduled workflows on a daily basis for incremental data loading.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the data load processes to target.
  • Developed MLOAD scripts to load data to Teradata Data mart.
  • Built data movement processes that load data from databases and files into Teradata by developing Korn shell scripts using Teradata SQL and utilities such as FASTLOAD, FASTEXPORT, and MULTILOAD.
  • Involved in unit testing and User Acceptance Testing (UAT) to verify that data loaded into targets from different source systems was accurate per user requirements.
  • Scheduled jobs using Autosys to automate Informatica sessions.
  • Provided Production Support at the end of every release.
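The MLOAD scripts mentioned above follow a fairly standard MultiLoad control-script shape; the sketch below generates one from a shell wrapper. Database, table, and logon values are placeholders, and the `mload` utility itself is not invoked here:

```shell
#!/bin/sh
# Generate an illustrative MultiLoad control script. All object names and
# the logon string are made up; the real scripts targeted project tables.
cat > load_orders.mld <<'EOF'
.LOGTABLE edw.ml_orders_log;
.LOGON tdprod/etl_user,********;
.BEGIN IMPORT MLOAD TABLES edw.orders_stg;
.LAYOUT orders_layout;
.FIELD order_id  * VARCHAR(12);
.FIELD order_amt * VARCHAR(18);
.DML LABEL ins_orders;
INSERT INTO edw.orders_stg (order_id, order_amt)
VALUES (:order_id, :order_amt);
.IMPORT INFILE orders.dat
        LAYOUT orders_layout
        FORMAT VARTEXT '|'
        APPLY ins_orders;
.END MLOAD;
.LOGOFF;
EOF

# In production this would be run as: mload < load_orders.mld >> load.log 2>&1
wc -l load_orders.mld
```

Wrapping the control script in a Korn/Bourne shell driver is what allows the load to be parameterized and scheduled (for example through Autosys, as above).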

Environment: Informatica Power Center 9.1, Oracle 10g, SQL Server 2000/2008, UNIX, COBOL, ERWIN 3.5, Shell script, Rapid-SQL, Toad, Teradata 12, Visio, Autosys.

Confidential, Hartford, CT

Informatica Developer

Responsibilities:

  • Used Informatica Power Center 8.6 for migrating data from various OLTP databases and other applications to the Radar Store Data Mart.
  • Worked with different sources like Relational, Mainframe (COBOL), XML, flat files (CSV) loaded the data into Oracle staging.
  • Created complex Informatica mappings with extensive use of Aggregator, Union, Filter, Router, Normalizer, Java, Joiner and Sequence generator transformations.
  • Created and used parameter files to perform different load processes using the same logic.
  • Extensively used PL/SQL for creation of stored procedures and worked with XML Targets.
  • Filtered Changed data using Power exchange CDC and loaded to the target.
  • Defined Target Load Order Plan and Constraint based loading for loading data appropriately into multiple Target Tables.
  • Used different Tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait and Control) in the creation of workflows.
  • Utilized Informatica Data Quality (IDQ), newly introduced with Informatica version 8.
  • Performed performance tuning at the source, target, mapping, and session levels.
  • Modified existing UNIX scripts and used them to automate the scheduling process.
  • Coordinated with the testing team to help them understand the business and transformation rules used throughout the ETL process.

Environment: Informatica Power Center 8.6, Informatica Power Exchange, Oracle 10g, XML, UNIX, PL/SQL, Windows 2000/XP.
