
Sr ETL Developer Resume


Farmington Hills, MI

PROFESSIONAL SUMMARY:

  • Detail-oriented, solutions-driven, quality-minded ETL, Data Warehouse, and Business Intelligence professional with 9+ years of advanced skill with leading-edge programming tools, complemented by a proven ability to assimilate and rapidly apply emerging technologies.
  • 9+ years of success in architecture, design, development, and implementation/support across leading-edge technologies for numerous Fortune 500 companies.
  • Experience working in Software Applications, Analysis, Design, Development, Testing, Maintenance and Enhancements in Data Warehousing and Data Integration.
  • Enhancement of Data Warehouse and ODS systems using Informatica PowerCenter 10.x/9.x/8.x/7.x/6.x, Oracle, DB2, SQL, and PL/SQL; well versed in Data Warehousing architecture and technology, Operational Data Stores, and Dimensional Data Modeling.
  • Experience in designing and implementation of real-time Data warehouse using Informatica Power Center, SSIS, DataStage, Pentaho.
  • Worked on PowerExchange bulk data movement using the PowerExchange Change Data Capture (CDC) method; knowledge of Master Data Management (MDM) concepts and methodologies and the ability to apply them in building MDM solutions.
  • Applied the Address Doctor tool in IDQ plans to obtain gold-source/master data (MDM).
  • Extensive working experience in Dimensional Data Modeling, Star-Schema and Snowflake-Schema.
  • Worked extensively with flat files and relational source data, workflow and scheduling job design, and resolving production issues.
  • Knowledge on Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) for Data Profiling.
  • Used features like pre-session/post-session commands/SQL and fine-tuned databases, mappings, and sessions to get optimal performance.
  • Worked with Informatica Cloud to create source and target objects and develop source-to-target mappings.
  • Implemented slowly changing dimensions Type 1, Type 2, and Type 3, slowly growing targets, and simple pass-through mappings using PowerCenter.
  • Experience in UNIX shell scripting.
  • Understanding of relational (ROLAP) and multidimensional (MOLAP) modeling, data warehousing concepts, star and snowflake schemas, database design methodologies, and metadata management.
  • Extensive experience with relational databases: Oracle 10g/11g, DB2, SQL Server, and Amazon AWS Redshift.
  • Extensive experience with ERP and CRM systems.
  • Experience in developing test strategies, test plans, test cases, use cases, and test scripts.
  • Experience working in Agile methodology with sprints.
  • Experience creating and managing user accounts, security, rights, disk space, and process monitoring in Solaris and Red Hat Linux.
  • Experience working with Health Industry standards HIPAA, HL7, ANSI-834/837, NCPDP.
  • Possess excellent interaction, communication, and problem-solving abilities.
  • Excellent analytical, written, and verbal communication skills.
  • Strong experience in offshore and onsite coordination.
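For illustration, the slowly changing dimension Type 2 pattern referenced above can be sketched in a few lines; the row layout and function name here are hypothetical, not drawn from any engagement listed in this resume.

```python
from datetime import date

def apply_scd_type2(dimension, incoming, today=None):
    """Type 2 update: expire the current version of a changed row and
    insert a new version, preserving history (simplified sketch)."""
    today = today or date(2020, 1, 1)
    for row in dimension:
        if row["key"] == incoming["key"] and row["current"]:
            if row["name"] == incoming["name"]:
                return dimension          # no attribute change: nothing to do
            row["current"] = False        # expire the old version
            row["end_date"] = today
    dimension.append({                    # insert the new current version
        "key": incoming["key"], "name": incoming["name"],
        "current": True, "eff_date": today, "end_date": None,
    })
    return dimension
```

A Type 1 load would instead overwrite `name` in place, and Type 3 would keep the prior value in an extra column; Type 2 is the only variant that retains full history.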

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.6.1/8.x/7.x, Informatica PowerExchange 9.6.1/8.x, IDE, IDQ, MDM, SSIS, SSRS, DataStage, Pentaho, SAP BusinessObjects Data Integrator (BODI), Oracle Data Integrator (ODI)

Data Modeling Tools: ERWIN, Microsoft Visio Professional, SQL Developer

Reporting Tools: OBIEE, Cognos, Business Objects XIR2/XIR3, Crystal Reports 8.x/10.x, QlikView, MicroStrategy

Databases: Oracle 12c/11g/10g/9i/8i, DB2, Teradata V13/V12/V2R5, SQL Server, MySQL, Netezza, Hadoop, Redshift

Languages: UNIX shell scripting, SQL/PL-SQL, COBOL, Java, Python, C

DB Tools: SQL*Plus, SQL*Loader, SQL*Forms, TOAD

Web Tools: HTML, XML, JavaScript, Servlets, EJB

Other Tools: VSTS, TFS, Jira, Control-M, AutoSys, Toad, SCM

OS: Windows NT/2000/2003/7, UNIX, Linux, AIX

PROFESSIONAL EXPERIENCE:

Confidential, Farmington Hills, MI

Sr ETL Developer

Responsibilities:

  • Worked with business users and customers to gather business requirements and completed the Business Requirements Document (BRD) and Requirement Traceability Matrix (RTM).
  • Designed ETL process using Informatica Tool to load from Sources to Targets through data Transformations.
  • Worked with Informatica 9.6.1 PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Used IDE / IDQ tool to load the data from file to stage tables.
  • Used Informatica partition types such as database and pass-through partitioning for performance tuning.
  • Involved in Reviews & Unit Testing of Mappings and Workflows.
  • Involved in creating mapplets and worklets.
  • Used UNIX shell scripts to FTP the source files from EDI servers using the config files.
  • Hands-on experience building MDM composite services using the Services Integration Framework (SIF), including setup and configuration of the SIF SDK.
  • Worked on various workflow tasks such as Command, Event Wait, and Event Raise.
  • Created various parameter files and workflow variables for flexible run of the workflows.
  • Modified existing mappings for enhancements of new business requirements.
  • Extensively used Korn Shell Scripts for doing manipulations of the flat files, given by the share brokers.
  • Performed unit testing and created documents with the test cases using SQL scripts.
  • Expertise in requirement analysis, design, coding, testing, and implementation of ETL/DWH projects using Informatica PowerCenter 9.x/8.x/7.x, Informatica PowerExchange, Data Masking, Test Data Management, PL/SQL, COBOL, DB2, and UNIX shell scripts.
  • Worked in an Agile SDLC.
  • Used various transformations like the Source Qualifier, Joiner, Expression, Filter, Lookup, Sequence Generator and Router to develop complex mappings using Power Center Designer.
  • Worked on various types of flat files i.e. fixed width and delimited.
  • Worked as a fully contributing team member under broad guidance, with independent planning and execution responsibilities.
  • Involved in UAT to obtain client go-live approval.
  • Created deployment groups and Harvest packages to move code to production.
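Parameter files like the ones mentioned above are plain text: bracketed section headers scoping a workflow or session, followed by NAME=VALUE pairs. A minimal reader can be sketched as follows; the section and parameter names are invented for illustration.

```python
def read_param_file(lines):
    """Parse Informatica-style parameter file lines into {section: {name: value}}.

    Sections are bracketed headers (e.g. [Folder.WF:wf_name]);
    parameters are NAME=VALUE pairs under the most recent section.
    """
    params, section = {}, None
    for raw in lines:
        line = raw.strip()
        if not line or line.startswith("#") or line.startswith(";"):
            continue                              # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]                  # new section header
            params.setdefault(section, {})
        elif "=" in line and section is not None:
            name, _, value = line.partition("=")  # split at first '='
            params[section][name.strip()] = value.strip()
    return params
```

Because values are resolved at run time, the same workflow can be pointed at different load dates or input files simply by swapping parameter files, which is what makes the "flexible run" of workflows possible.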

Environment: Informatica 9.6.1, Oracle, Teradata, Power Exchange, Flat Files, Passport, Harvest, Windows XP, UNIX and Control -M

Confidential, Southfield, MI

Sr ETL Developer

Responsibilities:

  • Instrumental in Analysis, Requirements Gathering and documentation of Functional and Technical specifications.
  • Involved in Dimensional modeling to Design and develop STAR Schemas using ERwin to identify Fact and Dimension Tables.
  • Worked extensively on Informatica client tools such as Designer, Workflow Manager, and Workflow Monitor.
  • Used ETL to load data from sources such as flat files and Oracle into Oracle and Teradata target databases; based on business requirements, created reusable transformations in the Transformation Developer and mapplets in the Mapplet Designer.
  • Used Informatica Data Quality (IDQ) to profile the data and applied rules to the Provider subject areas to achieve Master Data Management (MDM).
  • Worked extensively on Linux scripting to call Informatica jobs.
  • Developed complex mappings to transform the data using Rank, Sorter, Stored Procedure, Joiner, Aggregator, Filter, Connected Lookup, Unconnected Lookup, and Router transformations.
  • Worked on multiple projects using the Informatica Developer tool (IDQ).
  • Involved in migration of mappings from IDQ to PowerCenter.
  • Implemented Slowly Changing Dimensions Type 2 to keep track of historical data.
  • Used a dynamic lookup cache for slowly changing dimensions.
  • Used B2B Data Transformation to extract data from unstructured sources (XML).
  • Used the Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions, and scheduled them to run at specified times with the required frequency.
  • Implemented performance tuning techniques by identifying and resolving bottlenecks in sources, targets, transformations, mappings, and sessions.
  • Experience with Big Data components such as tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport.
  • Developed batch processes for financial reporting applications and modules using Perl and Korn shell scripts on an Oracle database with partitions and sub-partitions.
  • Used the Workflow Monitor to monitor jobs, reviewed the error logs generated for each session, and resolved them; performed unit testing and system testing.
  • Performed database-side tuning using Explain Plan and analyze-table queries.
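The dynamic lookup cache used for the SCD loads above differs from a static cache in one key way: rows inserted during the run are added to the cache immediately, so a key that appears twice in one load is inserted only once. A simplified model of that behavior (names hypothetical):

```python
def load_with_dynamic_lookup(source_rows, target):
    """Model of a dynamic lookup cache: the cache starts primed from the
    target and is updated as rows are inserted, so in-flight duplicates
    within a single load are caught (simplified sketch)."""
    cache = {row["key"]: row for row in target}   # prime cache from target
    inserted = []
    for row in source_rows:
        if row["key"] not in cache:
            cache[row["key"]] = row               # update cache at once
            target.append(row)                    # insert into target
            inserted.append(row["key"])
    return inserted
```

With a static cache, both occurrences of a new key would miss the lookup and produce a duplicate insert; keeping the cache in sync is what makes the dynamic variant safe for slowly changing dimensions.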

Environment: Informatica Power Center 9.6.1/8.6, Data Exchange (B2B), Oracle 10g, DB2, AutoSys, SQL Server 2008, PL/SQL, Teradata, Toad, ERwin 3.5, Unix

Confidential, Rochester, NY

ETL Informatica Developer

Responsibilities:

  • Offered hands-on support for development and maintenance of the Hadoop platform and various associated components for data ingestion, transformation, and processing.
  • Developed UNIX shell scripts and created Command and Email tasks to provide the pre-session and post-session requirements for various Informatica jobs.
  • Worked with Informatica PowerCenter client tools like Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Extracted data from Oracle, SQL Server database tables and flat files.
  • Performed performance tuning of mappings and ETL procedures at both the mapping and session level to increase load performance.
  • Worked closely with database administrators and application development team(s) on the design and implementation of the database.
  • Participated in weekly end user meetings to discuss data quality, performance issues. Ways to improve data accuracy and new requirements, etc.
  • Developed shell scripts in Unix to automate jobs in DEV and QA.
  • Followed the Software Development Life Cycle (SDLC), involved in phases including Requirements, Analysis/Design, Development, and Testing.
  • Designed data and data quality rules using IDQ and was involved in cleansing the data using IDQ.
  • Used IDQ to profile the project source data, define or confirm the metadata definitions, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide guidance on how to proceed with the ETL processes.
  • Worked with Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Developer, and Mapping Designer.
  • Worked with Slowly Changing Dimensions Type1, Type2 for Data Loads.
  • Handled various loads like Intra Day Loads, Daily Loads, Weekly Loads, Monthly Loads, and Quarterly Loads using Incremental Loading Technique.
  • Worked with Incremental Loading using Parameter Files, Mapping Variables and Mapping Parameters.
  • Involved in error handling using session logs and reject files in the Workflow Monitor.
  • Performed data ingestion using ETL tools, specifically Sqoop and Big Data Edition, and Hadoop transformations (using MapReduce and Spark/Scala).
  • Worked with the Debugger for handling the data errors in the mapping designer.
  • Involved in bug fixes for production issues.
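The incremental loading described above typically tracks a high-water mark, for example a mapping variable holding the last extracted timestamp, so each run pulls only rows changed since the previous run. Sketched here with hypothetical field names:

```python
def incremental_extract(rows, last_loaded):
    """Incremental load sketch: pull only rows newer than the stored
    high-water mark, then advance the mark for the next run."""
    new_rows = [r for r in rows if r["updated"] > last_loaded]
    # advance the mark to the newest extracted row (unchanged if no new rows)
    new_mark = max((r["updated"] for r in new_rows), default=last_loaded)
    return new_rows, new_mark
```

In PowerCenter terms, `last_loaded` plays the role of a mapping variable persisted in the repository (or supplied via a parameter file) between daily, weekly, or monthly runs.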

Environment: Informatica PowerCenter 9.5/8.x, Oracle 11g, DB2, SQL Server 2005, Flat Files, Control-M, PL/SQL, Rapid SQL, Quality Center, Unix, Linux

Confidential, Orlando, FL

Informatica Developer

Responsibilities:

  • Designed and developed ETL process using Informatica tool.
  • Worked with various Active transformations in Informatica PowerCenter like Filter Transformation, Aggregator Transformation, Joiner Transformation, Rank Transformation, Router Transformation, Sorter Transformation, Source Qualifier, and Update Strategy Transformation.
  • Worked with various Passive transformations in Informatica PowerCenter like Expression Transformation, Sequence Generator, and Lookup Transformation.
  • Worked with Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager.
  • Responsible for extracting data from Oracle and flat files.
  • Responsible for performance tuning in Informatica PowerCenter.
  • Worked with both Connected and Un-Connected Lookups.
  • Made use of sorted input option for the performance tuning of aggregator transformation.
  • Used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
  • Prepared Technical Design documents and Test cases. Implemented various Performance Tuning techniques.
  • Designed workflows with many sessions with decision, assignment task, event wait, and event raise tasks, used Informatica scheduler to schedule jobs.
  • Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting.
  • Performed unit testing at various levels of the ETL and actively involved in team Peer reviews.

Environment: Informatica PowerCenter 9.5.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer), Oracle 10g, DB2, SQL, PL/SQL, Flat Files, Star Team

Confidential, New York, NY

ETL Developer

Responsibilities:

  • Used Informatica PowerCenter for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Migrated mappings, sessions, and workflows from Development to Test and then to the UAT environment.
  • Wrote shell scripts for file transfers, file renaming and several other database scripts to be executed from Unix.
  • Troubleshot issues in Test and Production; performed impact analysis and fixed the issues.
  • Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.
  • Created sessions, extracted data from various sources, transformed the data according to the requirements, and loaded it into the data warehouse.

Environment: Informatica PowerCenter 8.6.1/7.x, Informatica Power Exchange 8.1, Stored procedures, Pl/SQL, Informatica Data Quality, Unix (Solaris), HP Quality Center

Confidential

Programmer Analyst

Responsibilities:

  • Used Workflow Manager for creating, validating, testing and running the sequential and concurrent sessions and scheduling them to run at specified time.
  • Studied the existing environment and gathered requirements by consulting the clients on various aspects.
  • Performed data modeling and design of the data warehouse and data marts in star schema methodology with conformed, granular dimensions and fact tables.
  • Designed and developed data validation, load processes, test cases, error control routines, audit and log controls using PL/SQL, SQL.
  • Used Update Strategy and target load plans to load data into Type 2/Type 1 dimensions.
  • Created and used reusable Mapplets and transformations using Informatica PowerCenter.
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, testing of Informatica Sessions, and the Target Data.
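Audit and load-control routines like those above usually reconcile counts: every source row must end up either loaded or rejected, and the result is logged for the run. A minimal sketch (structure and field names invented for illustration):

```python
def audit_load(source_count, target_count, rejected):
    """Reconcile one load: source rows must equal loaded plus rejected rows.

    Returns an audit record suitable for writing to a control/log table.
    """
    balanced = source_count == target_count + rejected
    return {
        "source": source_count,    # rows read from the source
        "loaded": target_count,    # rows committed to the target
        "rejected": rejected,      # rows routed to an error/reject table
        "balanced": balanced,      # True only if nothing was silently lost
    }
```

In the PL/SQL version this would be an insert into an audit table at the end of each load, with an alert raised whenever the counts fail to balance.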

Environment: Informatica PowerCenter 7.1.3, MS-SQL Server 2005, PL/SQL, Stored Procedures, Informatica Data Quality, Unix (Solaris), HP Quality Center
