
Sr. ETL Developer Resume


Background Summary:

  • 5 years of IT experience in the analysis, design, development, testing, deployment, and support of data warehouse applications in the retail, insurance, and healthcare industries.
  • Extensive experience in interacting with business users for requirement analysis and to define business and functional specifications.
  • Extensive experience in building Data Marts, Enterprise Data Warehouses (EDW), Operational Data Store (ODS), using Ralph Kimball and Bill Inmon methodologies.
  • Extensive knowledge in developing Teradata FastExport, FastLoad, MultiLoad, and BTEQ scripts; coded complex scripts and fine-tuned queries to enhance performance.
  • Profound knowledge of the Teradata database architecture.
  • Strong experience in developing BTEQ scripts to load data marts from the enterprise data warehouse using volatile tables, temporary staging tables, and global temporary tables.
  • Demonstrated expertise in utilizing Informatica 9.x/8.x/7.x ETL tool suite for developing Data warehouses.
  • Worked on integration of various data sources like Oracle, SQL Server, DB2, MS Excel, Flat Files, XML files and Mainframe files.
  • Involved in designing and implementing patterns used to manage various ETL activities such as changed data capture, process auditing, and error handling.
  • Excellent experience in identifying performance bottlenecks and tuning the Informatica Loads at various levels for better performance and efficiency.
  • Extensive experience with database languages such as SQL and PL/SQL which includes writing triggers, Stored Procedures, Functions, Views and Cursors.
  • Extensive experience in creating and executing Integration test plans.
  • Collaborated with business clients to perform User Acceptance Testing.
  • Experience in using Informatica command-line utilities such as pmcmd to control workflows in non-Windows environments (a pmcmd sketch follows this list).
  • Experience in UNIX shell scripting, FTP and file management in various UNIX environments.
  • Worked with business clients/reporting team to help them create Ad-Hoc and Canned Reports.
  • Participated in on-call rotation and production support.
  • Thorough understanding of Software Development Life Cycle (SDLC) including requirements analysis, system analysis, design, development, documentation, testing, training, implementation and post-implementation review.
  • Strong interpersonal, written and verbal communication, and facilitation skills.
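
A minimal sketch of the kind of pmcmd call used to control a workflow from a UNIX shell script, assuming hypothetical service, domain, folder, and workflow names (INT_SVC, DOM_DEV, DWH_LOADS, wf_load_claims) and credentials supplied through environment variables:

    #!/usr/bin/ksh
    # Hypothetical names throughout; pmcmd returns a non-zero exit code on failure.
    pmcmd startworkflow -sv INT_SVC -d DOM_DEV \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f DWH_LOADS -wait wf_load_claims
    if [ $? -ne 0 ]; then
        echo "wf_load_claims failed" >&2
        exit 1
    fi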

Education

  • Bachelor of Technology in Electronics and Communications Engineering

Technical Skills:

Databases & Tools

Oracle 9.x/10g/11g, Microsoft SQL Server 2008/2005, DB2 UDB 7.2/8.1, ADABAS,
RapidSQL, WinSQL, TOAD, SQL Developer

ETL Tools

Informatica 9.1.0/8.6/8.5.1/8.1.1/7.1.1, Informatica Power Exchange 8.x/9.x, DT Studio, Informatica Web Services

Teradata Tools & Utilities

Query Facilities: Queryman, BTEQ, SQL Assistant, Teradata Administrator
Load & Export: FastLoad, MultiLoad, FastExport

Reporting Tools

Business Objects XI r2

Operating Systems

HP-UX, Sun Solaris, IBM AIX, Windows XP

Data Modeling

Erwin, ER Studio, Visio

Middleware

IBM MQ Series, JMS.

Miscellaneous

ITSM, JIRA, HP Quality Center, PuTTY, CuteFTP, Autosys

PROFESSIONAL EXPERIENCE:

Confidential, Madison, WI Sep’10 – Current
Sr. ETL Developer

Responsibilities:

  • Interacted with business users to identify and understand system requirements, and conducted impact and feasibility analysis.
  • Worked with Integration architect in developing the data extraction strategy from different operational systems into ODS/TRIAGE (staging) and then into EDW and Data marts (Analytical Data Layer).
  • Gained functional experience in billing, accounts receivable, and insurance products such as personal/commercial auto, property, and personal umbrella.
  • Worked with source system teams to build the ODS/TRIAGE (staging) model for different subject areas.
  • Worked with data architect and data modeler in designing the data model for various subject areas in EDW (CSA) that would meet the data requirements.
  • Developed high-level design documents covering the data extraction, data loading, data transformation, and error handling techniques; also wrote source-to-target mapping specifications.
  • Modified BTEQ scripts to load data from the Teradata staging area into the Teradata data mart (a BTEQ sketch follows this list).
  • Created scripts using Teradata utilities (FastLoad, MultiLoad, FastExport).
  • Developed projects using Informatica B2B DT studio.
  • Used parser, serializer, mapper and other components along with unstructured data transformations.
  • Sourced data from multiple operational platforms (DB2, Oracle, and MQ Series) on a near-real-time basis.
  • Created mappings using the Transformations like Source qualifier, Aggregator, Expression, Dynamic lookup, Router, Filter, Rank, Sequence Generator, Update Strategy and User Defined Transformation.
  • Created Mapplets, reusable transformations and used them in different mappings.
  • Developed shared libraries and created the technical foundation for Informatica developers; mentored and taught junior developers; maintained warehouse metadata, naming standards, and warehouse standards for future application development.
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN; created several custom tables, views, and macros to support reporting and analytic requirements.
  • Developed shell scripts for profiling and tracing back the mainframe data sets to actual source.
  • Troubleshot problems caused by failures of jobs that use the MultiLoad and FastLoad utilities.
  • Created global temporary tables required to run some applications on a bi-weekly, monthly, or quarterly basis.
  • Involved in writing and debugging SQL scripts and stored procedures for archival purposes.
  • Involved in unit testing and system testing to verify that data extracted from the different source systems was loaded into the targets accurately and according to user requirements.
  • Worked with Scheduling team to set up the Autosys schedule for jobs that will load historical data for Financial Regulatory purposes.
  • Migrated repository objects and scripts from DEV to PRE-PROD environments. Extensive experience in troubleshooting and solving migration issues and production issues.
  • Provided on-call support.
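
A minimal sketch of the staging-to-data-mart BTEQ pattern referenced above, wrapped in a UNIX shell script; the TDPID, database, table, and column names (tdprod, STG_DB, DM_DB, POLICY_STG, POLICY_FACT) are hypothetical placeholders:

    #!/usr/bin/ksh
    # Credentials come from environment variables; all object names are hypothetical.
    bteq <<EOF
    .LOGON tdprod/$TD_USER,$TD_PWD;

    /* Stage today's delta in a volatile table so the data mart insert is one pass */
    CREATE VOLATILE TABLE VT_POLICY_DELTA AS (
        SELECT policy_id, premium_amt, load_dt
        FROM   STG_DB.POLICY_STG
        WHERE  load_dt = CURRENT_DATE
    ) WITH DATA ON COMMIT PRESERVE ROWS;

    INSERT INTO DM_DB.POLICY_FACT (policy_id, premium_amt, load_dt)
    SELECT policy_id, premium_amt, load_dt
    FROM   VT_POLICY_DELTA;

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF;
    .QUIT 0;
    EOF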

Environment: Informatica Power Center 8.5/9.1, Teradata 13, Teradata load utilities, Oracle 11g, DB2, Informatica Power Exchange 8.5/9.1.0, DT Studio, JMS, MQ Series, Informatica Web Services Hub, ER Studio, Win SQL, SQL Developer, Autosys

Confidential, KY May’09 – Sep’10
ETL Developer

Responsibilities:

  • Interacted with the Business Personnel to analyze the business requirements and transform the business requirements into the technical requirements.
  • Prepared technical specifications for the development of Informatica (ETL) mappings to load data into various target tables.
  • Designed and Developed ETL logic for implementing CDC by tracking the changes in critical fields required by the user.
  • Developed standard and reusable mappings and mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, connected/unconnected lookup, and filter.
  • Extensive use of Persistent cache to reduce session processing time.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Modified shell/Perl scripts to rename and back up the extracts.
  • Used Workflow Manager to create, validate, test, and run sequential and concurrent sessions, schedule them to run at specified times, and read data from different sources and write it to target databases.
  • Wrote a PL/SQL stored procedure to archive two years' worth of data on a rolling basis (a sketch follows this list).
  • Implemented screen door process for cleaning flat files as per the business requirements.
  • Prepared an ETL mapping document for every mapping and a data migration document for smooth transfer of the project from the development environment to testing and then to production.
  • Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems was loaded into the targets accurately and according to user requirements.
  • Maintained an issue log of issues raised during the UAT phase for future reference.
  • Prepared and used test data/cases to verify the accuracy and completeness of the ETL process.
  • Led the team in recovering from a data warehouse corruption; put together the schedule and developed the plan to recover in a short time.
  • Actively involved in production support and transferred knowledge to other team members.
  • Coordinated between different teams across the circle and the organization to resolve release-related issues.
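
A minimal sketch of the rolling two-year archival procedure mentioned above, created through a sqlplus heredoc from a UNIX shell script; the schema, table, and procedure names (DWH.CLAIM_DTL, DWH.CLAIM_DTL_ARCH, SP_ARCHIVE_CLAIMS) and the load_dt column are hypothetical:

    #!/usr/bin/ksh
    # Connection details come from environment variables; object names are hypothetical.
    sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_SID" <<'EOF'
    CREATE OR REPLACE PROCEDURE SP_ARCHIVE_CLAIMS AS
    BEGIN
        -- Copy rows older than two years into the archive table, then purge them
        INSERT INTO DWH.CLAIM_DTL_ARCH
        SELECT * FROM DWH.CLAIM_DTL
        WHERE  load_dt < ADD_MONTHS(TRUNC(SYSDATE), -24);

        DELETE FROM DWH.CLAIM_DTL
        WHERE  load_dt < ADD_MONTHS(TRUNC(SYSDATE), -24);

        COMMIT;
    END;
    /
    EXIT;
    EOF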

Environment: Informatica Power Center 8.1, Erwin, Autosys, Flat Files, Oracle, AIX, K-shell scripts, ITSM, Quality Center

Confidential, Menomonee Falls, WI Nov’07 – May’09
ETL Developer

Responsibilities:

  • Analyzed the system for the functionality required as per the requirements and involved in the preparation of Functional specification document.
  • Performed extensive data analysis along with subject matter experts and identified source data and implemented data cleansing strategy.
  • Prepared technical specifications for the development of Informatica (ETL) mappings to load data into various target tables and defining ETL standards.
  • Worked with various Informatica client tools like Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Workflow Manager.
  • Created mappings using different transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Developed mapplets and worklets for reusability.
  • Implemented a weekly error tracking and correction process using Informatica.
  • Modified BTEQ scripts to load data from Teradata Staging area to Teradata data mart.
  • Created scripts using Teradata utilities (FastLoad, MultiLoad, FastExport); a FastLoad sketch follows this list.
  • Involved in creating secondary and join indexes for efficient data access based on the requirements.
  • Created Maestro schedules/jobs to automate the ETL load process.
  • Involved in performance tuning and optimization of Informatica mappings and sessions.
  • Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems was loaded into the targets accurately and according to user requirements.
  • Prepared test data/cases to verify accuracy and completeness of ETL process.
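
A minimal sketch of a FastLoad script of the kind referenced above, driven from a UNIX shell script; the TDPID, database, table, error-table, and file names (tdprod, STG_DB, POLICY_STG, /data/inbound/policy_extract.dat) are hypothetical placeholders:

    #!/usr/bin/ksh
    # FastLoad targets an empty staging table; all object and file names are hypothetical.
    fastload <<EOF
    LOGON tdprod/$TD_USER,$TD_PWD;
    DATABASE STG_DB;

    SET RECORD VARTEXT "|";
    DEFINE policy_id   (VARCHAR(18)),
           premium_amt (VARCHAR(18)),
           load_dt     (VARCHAR(10))
           FILE = /data/inbound/policy_extract.dat;

    BEGIN LOADING POLICY_STG
        ERRORFILES POLICY_STG_ERR1, POLICY_STG_ERR2
        CHECKPOINT 100000;

    INSERT INTO POLICY_STG
    VALUES (:policy_id, :premium_amt, :load_dt);

    END LOADING;
    LOGOFF;
    EOF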

Environment: Informatica Power Center 7.1/8.1, PowerExchange 7.1, Teradata, IBM MQ Series, XML, XSD, Oracle 9i, DB2, SQL Server, Delimited Flat Files, HP-UX, Erwin 7
