
Etl/ Informatica Developer Resume


Chicago, IL

SUMMARY

  • 8 years of hands-on development experience with ETL tools (Informatica, SSIS), Oracle databases (11g, Forms and Reports), MS SQL Server 2012+, MySQL, Teradata, and SSRS.
  • Extensive knowledge of the PowerCenter components: PowerCenter Designer, PowerCenter Repository Manager, Workflow Manager, and Workflow Monitor.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Demonstrated experience with the design and implementation of Informatica Data Quality (IDQ v9.1) applications for business and technology users across the full development life cycle.
  • Thorough knowledge of creating ETL processes to load data from different data sources, and a good understanding of Informatica installation.
  • Experience in integrating business applications with the Informatica MDM hub using batch processes, SIF, and message queues.
  • Experienced in scheduling sequence and parallel jobs using DataStage Director, UNIX scripts, and scheduling tools.
  • Extensive ETL tool experience using IBM InfoSphere/WebSphere DataStage to perform ETL and ELT operations on data.
  • Experience in performance tuning of Informatica mappings and sessions to improve the performance of large-volume projects.
  • Understanding of the AWS product and service suite, primarily EC2, S3, VPC, Redshift, Spectrum, EMR (Hadoop), and related monitoring services, including their applicable use cases, best practices, implementation, and support considerations.
  • Strong in SQL, PL/SQL, SQL*Loader, SQL*Plus, and MS SQL.
  • Working knowledge of data warehouse techniques and practices, including ETL processes, dimensional data modeling (star schema, snowflake schema, fact and dimension tables), OLTP, and OLAP.
  • Experience with data cleansing, data profiling, and data analysis; UNIX shell scripting, Perl scripting, and SQL and PL/SQL coding.
  • Involved in testing, test plan preparation, and process improvement for ETL developments, with good exposure to development, testing, debugging, implementation, documentation, user training, and production support.
  • Experience in Oracle supplied packages, Dynamic SQL, Records and PL/SQL Tables.
  • Hands on experience in developing Stored Procedures, Functions, Views and Triggers, SQL queries using SQL Server and Oracle PL/SQL.
  • Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Experience in designing Data engineering pipelines to and from Snowflake, in batch/real-time modes.
  • Database/ETL performance tuning: broad experience in database development, including effective use of database objects, SQL Trace, Explain Plan, different types of optimizers, hints, indexes, table partitions, subpartitions, materialized views, global temporary tables, autonomous transactions, bulk binds, and Oracle built-in functions, as well as performance tuning of Informatica mappings and workflows.
  • Experience in importing/exporting data between different sources like Oracle/Access/Excel etc. using SSIS/DTS utility.
  • Strong data modeling experience using ER diagrams, dimensional data modeling, star schema modeling, and snowflake modeling, with tools such as Erwin and Embarcadero ER/Studio.
  • Worked to a great extent on the design and development of Tableau dashboards using calculations, parameters, calculated fields, groups, sets, and hierarchies.
  • Worked directly with non-IT business analysts throughout the development cycle, and provided production support for ETL.
  • Proficient in Informatica administration work, including installing and configuring Informatica PowerCenter and repository servers on Windows and UNIX platforms, backup and recovery, and folder and user account maintenance.
  • Experience with industry software development methodologies such as Waterfall and Agile Scrum within the software development life cycle.
  • Involved in understanding client Requirements, Analysis of Functional specification, Technical specification preparation and Review of Technical Specification.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
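Several of the points above reference star-schema modeling with fact and dimension tables. A minimal sketch of the idea, using Python's built-in sqlite3 module; the table and column names are illustrative, not from any engagement listed here:

```python
import sqlite3

# Minimal star-schema sketch: a fact table joined to a date dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,   -- surrogate key, YYYYMMDD
    calendar_date TEXT,
    year          INTEGER
);
CREATE TABLE fact_sales (
    sale_id  INTEGER PRIMARY KEY,
    date_key INTEGER REFERENCES dim_date(date_key),
    amount   REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024),
                            (20240102, '2024-01-02', 2024);
INSERT INTO fact_sales VALUES (1, 20240101, 100.0),
                              (2, 20240101, 50.0),
                              (3, 20240102, 75.0);
""")

# Typical dimensional query: aggregate the fact over a dimension attribute.
cur.execute("""
    SELECT d.calendar_date, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.calendar_date
    ORDER BY d.calendar_date
""")
rows = cur.fetchall()
print(rows)  # [('2024-01-01', 150.0), ('2024-01-02', 75.0)]
```

The surrogate `date_key` stands in for the natural date, the usual warehouse convention so that fact rows stay narrow and dimension attributes can change without touching the fact table.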

TECHNICAL SKILLS

Data Warehousing: Informatica PowerCenter 10.4/10.2/9.6/9.1, Power Connect, Power Exchange, Informatica PowerMart 6.2/5.1.2, DataStage 11.5, Informatica MDM 10.1/9.x, SQL*Loader, Flat Files (Fixed, CSV, Tilde Delimited), MS SQL Server Integration Services (SSIS).

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Erwin, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling.

Scheduling Tools: Autosys, Control-M, CA Workstation, Tivoli Workload Scheduler (TWS).

Reporting Tools: SSRS, Tableau

Database and related tools: Oracle 10g/9i/8i/8/7.x, MS SQL Server 2012/2008, Teradata, PL/SQL, Netezza, DB2, AQT

Languages: SQL, PL/SQL, SQL*Plus, Unix Shell Scripting, Java

Web Technologies: HTML, XHTML and XML

Operating Systems: Microsoft Windows XP/NT/2000/98/95, UNIX

Cloud Technologies: AWS, Azure, Informatica Cloud

PROFESSIONAL EXPERIENCE

ETL/ Informatica Developer

Confidential, Chicago IL

Responsibilities:

  • Coordinated with business users on requirement gathering and business analysis to understand the business requirements and prepare technical specification documents for coding ETL mappings for new requirement changes.
  • Developed the strategy for implementing data profiling, data quality, data cleansing, and ETL metadata.
  • Analyzed sources, requirements, and the existing OLTP system, and identified the required dimensions, facts, and factless fact tables from the database.
  • Analyzed large amounts of data with limited documentation, with SME support, to produce technical specifications, data quality improvement plans, and migration and conversion requirements by analyzing source and target schemas.
  • Used Informatica PowerCenter 10.2 for development, support, and maintenance of the extraction, transformation, and load (ETL) process that feeds data into the data warehouse.
  • Used Informatica Data Validation Option (DVO) to complete data testing quickly and easily by creating rules that test the data being transformed during the data integration process.
  • Involved in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts, and Decision Support Systems (DSS) using Multidimensional and Dimensional modeling (Star and Snowflake schema).
  • Used DataStage as an ETL tool to extract data from source systems and load the data into the Oracle database.
  • Identified and fixed bottlenecks, and tuned the mappings and sessions to improve performance; tuned both the ETL process and the databases.
  • Monitored the data, and worked on automating job runs and validating the data.
  • Built a reusable staging area in Oracle for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ or QualityStage.
  • Applied slowly changing dimensions such as Type 1 and Type 2 effectively to handle the loads.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Developed and implemented UNIX scripts to run batches, and JIL scripts to automate Autosys jobs for the ETL mappings.
  • Developed stored procedures, functions, views, triggers, and SQL queries using SQL Server and Oracle PL/SQL.
  • Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
  • Extensively worked on database performance tuning techniques and modifying complex join statements.
  • Used Snowflake functions to perform semi-structured data parsing entirely with SQL statements.
  • Interacted with the business, gathered requirements, and coordinated with the offshore team to deliver the code to the client.
  • Provided 24/7 on-call production support by identifying defects, documenting and resolving issues, and working with Change Management, DBA, and other teams to come up with fixes.
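The slowly-changing-dimension work listed above can be illustrated with a minimal Type 2 sketch: on a tracked-attribute change, expire the current row and insert a new version. The `dim_customer` table, the tracked `city` attribute, and the dates are hypothetical, chosen only to show the pattern:

```python
import sqlite3

# Hedged SCD Type 2 sketch (illustrative schema, not from a real project).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id  TEXT,
    city         TEXT,
    eff_date     TEXT,
    end_date     TEXT,
    is_current   INTEGER)""")
cur.execute("INSERT INTO dim_customer (customer_id, city, eff_date, end_date, is_current) "
            "VALUES ('C1', 'Chicago', '2023-01-01', '9999-12-31', 1)")

def scd2_apply(cur, customer_id, city, load_date):
    """Version the row only when the tracked attribute changes."""
    cur.execute("SELECT customer_key, city FROM dim_customer "
                "WHERE customer_id = ? AND is_current = 1", (customer_id,))
    row = cur.fetchone()
    if row and row[1] == city:
        return                      # no change: nothing to version
    if row:                         # expire the current version
        cur.execute("UPDATE dim_customer SET end_date = ?, is_current = 0 "
                    "WHERE customer_key = ?", (load_date, row[0]))
    cur.execute("INSERT INTO dim_customer (customer_id, city, eff_date, end_date, is_current) "
                "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, city, load_date))

scd2_apply(cur, 'C1', 'Columbus', '2024-06-01')   # city changed: new version
cur.execute("SELECT city, is_current FROM dim_customer "
            "WHERE customer_id = 'C1' ORDER BY customer_key")
versions = cur.fetchall()
print(versions)  # [('Chicago', 0), ('Columbus', 1)]
```

Type 1 would instead be a plain `UPDATE` of `city` in place, keeping no history.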

Environment: Informatica PowerCenter 10/9.6, MDM, Netezza, DataStage 11.5, Teradata 13/14, Snowflake, Teradata SQL Assistant, SQL, DB2, IDQ 9.6.1, Oracle 11g/10g, PL/SQL, Autosys, UNIX, Shell scripting, HP Quality Center, AQT.

Informatica Developer

Confidential, Columbus, Ohio

Responsibilities:

  • Designed ETL specifications with transformation rules, using ETL best practices for good performance, maintainability of the code, and efficient restartability.
  • Developed Informatica mappings to load data into Netezza and Eagle from various sources such as Oracle and flat files, using transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy, and designed and optimized the mappings.
  • Utilized Informatica IDQ 9.6.1 to complete initial data profiling and to match and remove duplicate data.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Experience in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.
  • Worked on the design and development of Informatica mappings and workflows to load data into the staging area and data marts in Oracle, Eagle, and Netezza.
  • Tuned performance of mappings and sessions by optimizing source and target bottlenecks, and implemented pipeline partitioning.
  • Scheduled the workflows, loaded the files, and monitored the workflows frequently to ensure data loading completed as expected.
  • Coordinated with the QA team to follow up on testing issues and make the code defect-free before migrating to production.
  • Performed data validation after successful end-to-end tests, and implemented appropriate error handling in ETL processes.
  • Developed UNIX scripts to move source files to the archive directory using CICS Explorer, and ran them in the ESP scheduling tool.
  • Experience in creating indexed views, complex stored procedures, effective triggers, and useful functions to facilitate efficient data manipulation and consistent data storage.
  • Participated in daily status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings. Used Agile methodology for the SDLC and utilized scrum meetings for creative and productive work.
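The file-archiving step mentioned above might look roughly like the following shell sketch: move processed source files into an Archive directory, stamping each with the run date. The directory layout, the `*.dat` pattern, and the demo file are assumptions for illustration:

```shell
#!/bin/sh
# Sketch of a post-load archive step (paths and file pattern are assumptions).
SRC_DIR=./incoming
ARCHIVE_DIR=$SRC_DIR/Archive
STAMP=$(date +%Y%m%d)

mkdir -p "$SRC_DIR" "$ARCHIVE_DIR"
printf 'sample\n' > "$SRC_DIR/customers.dat"   # demo file standing in for a real extract

for f in "$SRC_DIR"/*.dat; do
    [ -e "$f" ] || continue                    # skip when nothing matches the glob
    mv "$f" "$ARCHIVE_DIR/$(basename "$f").$STAMP"
done
```

Date-stamping the archived name keeps reruns from overwriting earlier loads, which matters when the same source file name arrives daily.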

Environment: Informatica PowerCenter 10.2/9.6, Informatica Data Quality 9.6.1, Netezza, Eagle, SQL, Oracle 11g/10g, PL/SQL, UNIX, ESP, CA Workstation.

ETL Developer

Confidential, Greensboro, NC

Responsibilities:

  • Analyzed the business systems, gathered requirements from the users and documented business needs for decision support data.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
  • Created entity relationship (ER) diagrams and maintained corresponding documentation for the corporate data dictionary, with all attributes, table names, and constraints.
  • Designed, developed, deployed, and implemented ETL mappings using Informatica, with source and target definitions to extract from flat files and relational sources.
  • Extensively used various transformations like Union, Expression, Filter, Aggregator, connected and unconnected look-up for implementing complex logic.
  • Responsible for loading data files from various external sources such as Oracle and MySQL into the staging area in MySQL and Vertica databases.
  • Migrated Workflows, Mappings, and other repository objects from Development to QA and then to production.
  • Extensively worked with the PDO (pushdown optimization) and CDC (change data capture) mechanisms.
  • Extensively used Data Flow tasks such as Derived Column, Conditional Split, Aggregate, and Execute Package Task, and implemented Slowly Changing Dimensions (SCD) Type 2 to maintain historical data.
  • Identified bottlenecks and tuned the mappings and sessions to improve performance.
  • Created sessions and workflows for processing and to populate the dimensions and facts in the star schema.
  • Participated in the development and implementation of the MDM decommissioning project using Informatica PowerCenter, which reduced the cost and time of implementation and development, and used SQL Assistant to query Teradata tables.
  • Published data using message queues to notify external applications of data changes in MDM Hub base objects.
  • Responsible for performance tuning at all levels of the Data warehouse.
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Created Test cases for the mappings developed and then created integration Testing Document.
  • Used the advanced features of PL/SQL like Records, Tables and Object types.
  • Responsible for writing SQL scripts and procedures to drop and create partitions of large-volume tables for the archival process.
  • Worked with the QA team on a daily basis to follow up on bugs and made sure the code was bug-free before migrating to production.
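The partition drop/create scripts for archival mentioned above can be sketched as a small generator of Oracle-style DDL. The table name, the `P_YYYYMM` partition naming, and the clause choices (`UPDATE GLOBAL INDEXES`, `VALUES LESS THAN`) are illustrative assumptions about one common monthly-range pattern, not the actual scripts:

```python
# Hedged sketch: one archival cycle drops the expired monthly range partition
# and adds the next one. All names and the retention scheme are illustrative.
def archive_partition_ddl(table, drop_month, add_month, high_value):
    """Return Oracle-style ALTER TABLE statements for one archival cycle."""
    return [
        f"ALTER TABLE {table} DROP PARTITION P_{drop_month} UPDATE GLOBAL INDEXES",
        f"ALTER TABLE {table} ADD PARTITION P_{add_month} "
        f"VALUES LESS THAN (TO_DATE('{high_value}', 'YYYY-MM-DD'))",
    ]

stmts = archive_partition_ddl("FACT_SALES", "202301", "202407", "2024-08-01")
for s in stmts:
    print(s)
```

Generating the DDL rather than hand-writing it keeps the partition names and high values consistent month over month.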

Environment: Informatica PowerCenter 9.6, Informatica MDM, Teradata 12, ETL, Shell scripting, Flat files, Oracle 11g, MS SQL Server 2012, PL/SQL, SQL*Loader, Excel, UNIX, Control-M, and Erwin 7.2.

ETL Developer

Confidential

Responsibilities:

  • Based on the requirements, created functional design documents and technical design specification documents for the ETL process.
  • Developed mappings and mapplets using Informatica Designer to load data into the ODS from various transactional source systems.
  • Used Informatica Designer to import the sources and targets, and to create the transformations and mappings for extracting, transforming, and loading operational data into the EDW from the ODS.
  • Used various transformations such as Expression, Filter, Rank, Source Qualifier, Joiner, Aggregator, and Normalizer in the mappings, and applied surrogate keys to the target tables.
  • Investigated and resolved ETL issues and provided necessary communications regarding progress, performance, issues, and risks.
  • Collaborated with other teams to address upstream and downstream integration dependencies via services and SLAs.
  • Migrated mappings from development to testing and performed unit testing and integration testing.
  • Scheduled ETL jobs, and monitored ETL loads and Informatica jobs for availability and reliability.
  • Developed UNIX Shell Scripts and SQLs to get data from Oracle tables before executing Informatica workflows.
  • Created connection pools and physical tables, defined joins, and implemented authorizations in the physical layer of the repository.
  • Coordinated wif QA, development teams during System Integration and User Acceptance Testing to accurately resolve the issues.
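A pre-workflow validation script of the kind described above might look like this sketch, which swaps in a simpler technique for illustration: it compares a flat-file extract's record count against a control value before the workflow is launched. The real scripts queried Oracle directly; the file names and layout here are assumptions:

```shell
#!/bin/sh
# Hedged sketch of a pre-workflow count check (file names are assumptions).
STAGE=./stage
mkdir -p "$STAGE"
printf 'row1\nrow2\nrow3\n' > "$STAGE/orders.dat"   # demo extract file
echo 3 > "$STAGE/orders.cnt"                        # demo control count

expected=$(cat "$STAGE/orders.cnt")
actual=$(wc -l < "$STAGE/orders.dat")

if [ "$actual" -eq "$expected" ]; then
    echo "COUNT_OK"    # a real script would invoke pmcmd startworkflow here
else
    echo "COUNT_MISMATCH: expected $expected, got $actual" >&2
    exit 1
fi
```

Failing fast on a count mismatch keeps a short or truncated extract from silently loading into the warehouse.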

Environment: Informatica PowerCenter 8.6, UNIX Shell Scripting, Oracle 9i, PL/SQL, Autosys.
