
Sr. ETL Developer Resume


St. Louis, MO

SUMMARY:

  • 10 years of IT experience in all phases of Data warehouse life cycle involving Analysis, Design, Development, Coding, Testing and Production Support of Business Intelligence Projects.
  • Strong knowledge of data warehouse concepts, fact tables, dimension tables, star and snowflake schema methodologies, and data modeling.
  • Extensive experience in data warehousing projects covering client requirements gathering, data modeling and development using the Informatica 9.x/8.x/7.x/6.x ETL tools: Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Extensive knowledge of BI technologies including Informatica Power Center 9.x/8.x/7.x/6.x, Teradata V2R5/V2R6/12/13/14 and Oracle 12c/11g/10g/9i/8i.
  • Expertise in writing SQL queries and proficient in PL/SQL.
  • Expertise in Enterprise Data Warehouse, Data Integration, Data Marts and Data Migration projects.
  • Familiarity with entity-relationship and multidimensional modeling (star schema, snowflake schema).
  • Experience in design and development of RDBMS/OLTP and dimensional models using the data modeling tools Erwin and MS Visio.
  • Domain knowledge in Healthcare, Retail, Insurance, Telecom, Wireless, Manufacturing and Financial Services.
  • Extensive experience working with business users, SMEs, application DBAs, ETL admins and production support staff, as well as senior management.
  • Experience in integrating data sources such as Teradata, Oracle, ODI, flat files, mainframes, Salesforce and XML.
  • Developed ETL strategies using an appropriate mix of database-level loading strategies and Informatica.
  • Experienced in data governance covering data profiling, lineage, completeness, cleansing, standards, accuracy, quality and validation of data.
  • Experience with the ABC (Audit, Balance and Control) data model for data control.
  • Involved in the full development lifecycle (SDLC) from requirements gathering through development and support using Informatica Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Developed complex mappings using a wide range of transformation logic, including Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Union, Update Strategy, SQL, Transaction Control and XML transformations.
  • Proficient in performance tuning of Informatica components as well as on the database side.
  • Worked on Slowly Changing Dimensions (SCD) Types 1, 2 and 3 to keep track of historical data (an SCD Type 2 sketch follows this list).
  • Proficient in writing UNIX shell scripts.
  • Worked with Waterfall and Agile Scrum methodologies across projects.
  • Experienced in creating and scheduling data load jobs using Control-M and Redwood Cronacle.
  • Created high-level and low-level design documents and provided solutions for requirements.
  • Involved in code reviews, architectural reviews and peer reviews of projects, and maintained documentation per client standards.
  • Excellent communication and interpersonal skills, ability to learn quickly, good analytical reasoning and high adaptability to new technologies and tools.
  • Experience coordinating and leading onsite-offshore development.
  • Strong team spirit and relationship management skills.
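
As a brief illustration of the SCD Type 2 handling mentioned in the bullet above, the SQL sketch below shows one common expire-and-insert pattern. All table and column names (CUSTOMER_DIM, CUSTOMER_STG, ADDRESS, EFF_START_DT, EFF_END_DT, CURR_IND) are hypothetical placeholders, not taken from any specific project.

    -- Minimal SCD Type 2 sketch: expire the current row when a tracked attribute changes
    UPDATE CUSTOMER_DIM
    SET    EFF_END_DT = CURRENT_DATE - 1,
           CURR_IND   = 'N'
    WHERE  CURR_IND = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   CUSTOMER_STG S
                   WHERE  S.CUSTOMER_ID = CUSTOMER_DIM.CUSTOMER_ID
                     AND  S.ADDRESS <> CUSTOMER_DIM.ADDRESS);

    -- ...then insert the new version of changed rows plus any brand-new keys
    -- (after the update, changed keys no longer have a current row, so the
    --  LEFT JOIN / IS NULL test picks up both cases)
    INSERT INTO CUSTOMER_DIM (CUSTOMER_ID, ADDRESS, EFF_START_DT, EFF_END_DT, CURR_IND)
    SELECT S.CUSTOMER_ID, S.ADDRESS, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   CUSTOMER_STG S
    LEFT JOIN CUSTOMER_DIM D
           ON D.CUSTOMER_ID = S.CUSTOMER_ID
          AND D.CURR_IND = 'Y'
    WHERE  D.CUSTOMER_ID IS NULL;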

TECHNICAL SKILLS:

Data Warehousing: Informatica Power Center 9.x/8.x/7.x/6.x, OLAP, OLTP

Data Modeling: Erwin 4.0/3.x, MS Visio

Databases: Oracle 12c/11g/10g/9i/8i/7.x, Teradata V2R5/V2R6/12/13/14, MS SQL Server 2012, Mainframes

Business Intelligence: Cognos 7/8/10, Cognos Report Net 1.1, Business Objects 5.0

Languages: SQL, PL/SQL

Load Utilities: MultiLoad (MLoad), FastExport, BTEQ, FastLoad, SQL*Loader

Tools: Toad 9.1, QC, Control-M, Cronacle, SQL Developer, Data Quality, Perfaware

Version Control: SVN, TortoiseCVS, Dell SharePoint

Operating System: Windows 2000/2003/XP/7/8/10, UNIX

PROFESSIONAL EXPERIENCE:

Confidential, St. Louis, MO

Sr. ETL Developer

Responsibilities:
  • Coordinate with Business Analysts and Data Stewardship team to understand business requirements.
  • Designed data flow for the project and participated in Data model reviews.
  • Served as team lead, interacting with Business Analysts to understand business requirements; involved in analyzing requirements to refine transformations.
  • Analysis of the specifications provided by the clients.
  • Prepared the HLD and project plan based on the business functional spec.
  • Reviewed ETL detailed design documents for mappings and the DDL specification document covering table creation, keys/constraints and data types.
  • Analyzed the data model to check table constraints and columns in the data mart.
  • Extracted data from sources such as Oracle, mainframe DB2 and flat files using Power Center Designer and Power Exchange, transformed it according to the business logic, and loaded it into the target warehouse.
  • Designed mappings per the business requirements using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Sequence Generator, Router, Union and Update Strategy.
  • Coordination of system/Integration/UAT testing with other teams involved in project and review of test strategy
  • Involved in a dimensional logical model with 10 facts and 30 dimensions covering 500 attributes.
  • Worked with DBA to create the physical model and tables.
  • Responsible to set up DEV environment as per project needs.
  • Created appropriate Primary Indexes (PIs), taking into consideration both the planned access of the data and even distribution of data across all available AMPs.
  • Wrote complex SQL using joins, subqueries and correlated subqueries for cross-verification of data.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables (a staging-to-base load sketch follows this list).
  • Reviewed SQL for missing joins and constraints, data format issues, mismatched aliases and casting errors.
  • Mainly focused on data comparison and anomaly detection between the legacy and BI environments.
  • Responsible for data analysis, data validations, RCA for deviations.
  • Responsible for design, data mapping analysis and mapping rules.
  • Generated reports and scripts using Teradata BTEQ and Teradata stored procedures.
  • Responsible for reviewing test cases in the Perfaware tool.
  • Responsible for raising defects and for defect prevention activities.
  • Responsible for generating a weekly dashboard with sales and marketing metrics.
  • Responsible for developing Teradata MLoad, FastLoad and BTEQ scripts to load data into the EDW.
  • Responsible for metadata analysis covering data lineage, consistency, completeness, validation and accuracy.
  • Responsible for maintaining the metadata repository per data governance standards.
  • Incorporated Reusable objects for data validation to ensure data quality.
  • Incorporated ABC data model for controlling data flow in EDW.
  • Helped the team fix technical issues and tuned database queries for better performance.
  • Kept track of the CI list, maintaining versions and change requests.
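
To illustrate the staging-to-base load and ABC control pattern referenced in the bullets above, here is a minimal Teradata-style SQL sketch of the kind of statements that would typically sit inside a BTEQ script or macro. SALES_STG, SALES_BASE and ETL_ABC_CONTROL are hypothetical names used purely for illustration.

    -- Move only the rows that are not yet in the base table
    INSERT INTO SALES_BASE (SALE_ID, STORE_ID, SALE_DT, SALE_AMT, LOAD_TS)
    SELECT S.SALE_ID, S.STORE_ID, S.SALE_DT, S.SALE_AMT, CURRENT_TIMESTAMP
    FROM   SALES_STG S
    LEFT JOIN SALES_BASE B
           ON B.SALE_ID = S.SALE_ID
    WHERE  B.SALE_ID IS NULL;

    -- Log the run in an ABC (Audit, Balance and Control) table for later reconciliation
    INSERT INTO ETL_ABC_CONTROL (JOB_NAME, RUN_DT, SRC_CNT, TGT_CNT, RUN_STATUS)
    SELECT 'SALES_STG_TO_BASE', CURRENT_DATE, SRC.CNT, TGT.CNT, 'COMPLETE'
    FROM   (SELECT COUNT(*) AS CNT FROM SALES_STG) SRC
    CROSS JOIN
           (SELECT COUNT(*) AS CNT
            FROM   SALES_BASE
            WHERE  CAST(LOAD_TS AS DATE) = CURRENT_DATE) TGT;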

Environment: Informatica 9.6, Erwin, Oracle, Teradata 14.0, Mainframes, Teradata SQL Assistant, MicroStrategy, UNIX shell scripting, Control-M, Perfaware

Confidential, Richardson, TX

DWH Specialist

Responsibilities:
  • Involved in Project planning, DEV Effort estimation.
  • Designed data flow for the project and participated in Data model reviews.
  • Created ETL Spec documents for ETL mappings development.
  • Analyzed and profiled data from various data sources and legacy systems feeding the Teradata warehouse.
  • Responsible for resolving recurring incidents permanently and performing break fixes.
  • Analyzed and interpreted complex data on all target systems.
  • Responsible for root cause analysis of production issues.
  • Created and modified MultiLoad jobs for Informatica using UNIX and loaded data into the EDW.
  • Tracking end to end Release cycle for break fixes coding, testing, migration, deployment & support.
  • Used Hierarchies tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles and display packages.
  • Performed data cleansing and standardization activities on source data.
  • Responsible for fixes in mappings, sessions, workflows.
  • Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
  • Involved with Data Stewardship Team for designing, documenting and configuring Informatica Data Quality environment for management of data.
  • Responsible for testing Teradata Objects in Preprod and BKP environments as per project needs.
  • Responsible for developing Teradata MLoad, FastLoad and BTEQ scripts to load data into the EDW.
  • Incorporated Reusable objects for data validation to ensure data quality.
  • Incorporated ABC data model for controlling data flow in EDW.
  • Responsible for the reconciliation and fallout process (a reconciliation query sketch follows this list).
  • Automated data warehouse and data mart refreshes using Maestro.
  • Improved performance of the sessions by creating partitions.
  • Extensively worked with the Debugger to handle data errors.
  • Defect Analysis and Defect Prevention Activities.
  • Worked on performance tuning to improve load times.
  • Worked on UNIX scripts to move files across different environments.
  • Responsible for ON Demand loads as per business user request.
  • Responsible for loading data up to the package layer to support the reporting team.
  • Monitored various loads (daily, weekly, half-yearly and yearly) using an incremental loading strategy.
  • Responsible for migrating the code between environments and Involved in UAT Testing.
  • Coordinated with all support teams, including PST, Migration, QA, DBAs and Informatica Admins, during deployments.
  • Deliver new and complex high quality solutions to clients in response to varying business requirements.
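
As a sketch of how the reconciliation and fallout checks mentioned above could be expressed in SQL (ORDER_SRC and ORDER_EDW are hypothetical table names):

    -- Reconciliation: compare row counts and amounts between source and EDW for a load date
    SELECT 'SOURCE' AS SIDE, COUNT(*) AS ROW_CNT, SUM(ORDER_AMT) AS TOT_AMT
    FROM   ORDER_SRC
    WHERE  LOAD_DT = CURRENT_DATE
    UNION ALL
    SELECT 'EDW', COUNT(*), SUM(ORDER_AMT)
    FROM   ORDER_EDW
    WHERE  LOAD_DT = CURRENT_DATE;

    -- Fallout: source rows that never reached the EDW, candidates for reprocessing
    SELECT S.ORDER_ID, S.ORDER_AMT
    FROM   ORDER_SRC S
    LEFT JOIN ORDER_EDW E
           ON E.ORDER_ID = S.ORDER_ID
    WHERE  S.LOAD_DT = CURRENT_DATE
      AND  E.ORDER_ID IS NULL;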

Environment: Informatica 9.6, Erwin, SQL Developer 4.0.3.16, Oracle, Teradata 14.0, MicroStrategy, UNIX shell scripting, Tivoli Workload Scheduler

Confidential, Philadelphia, PA

Sr. ETL Developer

Responsibilities:
  • Coordinate with Business Analysts to understand business requirements.
  • Involved in Project planning, Effort estimation.
  • Participated in data model reviews and DA Forms.
  • Created ETL Spec documents for ETL mappings development.
  • Responsible for POC implementation and share the data results with BSA and End Users.
  • Followed Agile Methodology in the project with Stories, weekly Sprints and Scrum calls.
  • Prepared SRS and TDS documents and published them for architectural review.
  • Set up the DEV environment to accommodate project needs.
  • Tracking end to end development cycle from requirements gathering, high level design, detailed design, coding, testing, migration, deployment and production support.
  • Created parameter files in order to pull data from Oracle Source environment.
  • Responsible for creation of stored procedures to load hierarchies from Oracle environment.
  • Incorporated ETL logic to process the input records only once in a day.
  • Responsible for creating Informatica mappings, sessions, workflows.
  • Responsible for creating Teradata Objects in DEV, QA environments as per project needs.
  • Worked on Teradata MLoad, FastLoad, TPump and BTEQ scripts to load data into the Teradata database.
  • Incorporated Reusable objects for data validation to ensure data quality.
  • Implemented Slowly Changing Dimensions - Type2 for data loading.
  • Improved performance of the sessions by creating partitions.
  • Implemented full pushdown optimization while loading data into the final base table (a pushdown-style INSERT ... SELECT sketch follows this list).
  • Developed workflow tasks such as Email, Event Wait, Event Raise, Timer, Command and Decision.
  • Worked on performance tuning to improve load times.
  • Worked on UNIX scripts to move files across different environments.
  • Responsible for loading data up to the package layer to support the reporting team.
  • Built Unit Test Cases while testing data in DEV environment.
  • Responsible for Peer Review, Architectural review of the project to get sign off in dev. environment.
  • Guided QA team to build test cases and create queries to systematically proceed with testing.
  • Responsible for migrating code between environments; involved in SIT and UAT testing.
  • Automated the load process by creating job chains using Control-M.
  • Responsible for production deployment activities.
  • Coordinate with all support teams like PST, Migration, DBAs and Informatica Administrators during production deployment.
  • Performed Root cause analysis for pre and post production issues and provided shadow support.
  • Prepared SRS, TDS, MD 120, Code review, Peer review documents.
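
Full pushdown optimization essentially lets the ETL tool translate the mapping logic into SQL that runs inside Teradata. A rough, hypothetical equivalent of such a generated statement is sketched below (CLAIM_STG, CLAIM_BASE and the column names are illustrative only):

    -- With full pushdown, filter and aggregation logic run as one set-based
    -- INSERT ... SELECT inside the database instead of row-by-row on the ETL server
    INSERT INTO CLAIM_BASE (MEMBER_ID, CLAIM_YEAR, CLAIM_MONTH, PAID_AMT)
    SELECT  MEMBER_ID,
            EXTRACT(YEAR  FROM CLAIM_DT),
            EXTRACT(MONTH FROM CLAIM_DT),
            SUM(PAID_AMT)                      -- Aggregator transformation equivalent
    FROM    CLAIM_STG
    WHERE   CLAIM_STATUS = 'APPROVED'          -- Filter transformation equivalent
    GROUP BY MEMBER_ID,
            EXTRACT(YEAR  FROM CLAIM_DT),
            EXTRACT(MONTH FROM CLAIM_DT);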

Environment: Informatica 9.5, Erwin, Oracle 12c/11g, TOAD, Teradata 14, Cognos, UNIX shell scripting, Quality Center, Control-M

Confidential

Sr. ETL Developer / Team lead

Responsibilities:
  • Interacted with business analysts, data architects to develop ETL flow.
  • Responsible for collecting, managing and documenting the Requirements.
  • Prepared Design and Development plan for the project.
  • Responsible for SFDC data connectivity with informatica environment.
  • Worked with flat file connections and directories.
  • Used UNIX shell scripts for pre-staging activities.
  • Responsible for the prerequisites of development activities.
  • Set up the development and QA environments.
  • Extracted data from Oracle, SFDC, flat files and transformed data using Source qualifier, Expression, Sequence Generator, Sorter and Router transformations.
  • Involved in Performance tuning at mapping and session level.
  • Built CDC logic on data warehouse tables.
  • Developed mapplets, worklets for data validation.
  • Implemented global tieouts for data validation (a tieout query sketch follows this list).
  • Extensively involved in creating Teradata BTEQ scripts in UNIX environment.
  • Responsible for writing and documenting the unit test cases.
  • Adhered to the code quality standards prescribed by the client.
  • Worked on UNIX scripts to troubleshoot QA and production issues.
  • Automated daily load process using Control-M.
  • Responsible for deployment activities.
  • Facilitated KT sessions for the team for knowledge sharing.
  • Provided timely status monitoring and risk tracking to the manager.
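
A minimal sketch of what a global tieout validation could look like as a SQL check; SRC_TXN and DW_TXN are hypothetical source and warehouse tables:

    -- Global tieout: row counts and amount totals must match between source and warehouse
    SELECT COALESCE(S.BUS_DT, T.BUS_DT) AS BUS_DT,
           S.ROW_CNT AS SRC_ROWS, T.ROW_CNT AS DW_ROWS,
           S.TXN_AMT AS SRC_AMT,  T.TXN_AMT AS DW_AMT,
           CASE WHEN S.ROW_CNT = T.ROW_CNT AND S.TXN_AMT = T.TXN_AMT
                THEN 'PASS' ELSE 'FAIL' END AS TIEOUT_STATUS
    FROM   (SELECT BUS_DT, COUNT(*) AS ROW_CNT, SUM(TXN_AMT) AS TXN_AMT
            FROM   SRC_TXN GROUP BY BUS_DT) S
    FULL OUTER JOIN
           (SELECT BUS_DT, COUNT(*) AS ROW_CNT, SUM(TXN_AMT) AS TXN_AMT
            FROM   DW_TXN GROUP BY BUS_DT) T
           ON S.BUS_DT = T.BUS_DT;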

Environment: Informatica 9.6/9.1, Oracle 11g, TOAD, Cognos, Teradata 13, Erwin, UNIX shell scripting, Control-M, Quality Center, Dell SharePoint.

Confidential

ETL Developer/ Team lead

Responsibilities:
  • Actively participated in the Agile development process.
  • Prepared the technical specification document and migration document.
  • Coordinated with the onsite coordinator to understand requirements.
  • Sourced data from Oracle and flat file environments.
  • Worked on extraction, transformation and loading of data using Informatica.
  • Developed Informatica mappings, sessions and workflows.
  • Wrote Teradata BTEQ scripts to move data between staging, intermediate, incremental, base and package tables.
  • Prepared test plans and test scripts.
  • Involved in code migration and production implementations.
  • Worked on creating reporting views (a view creation sketch follows this list).
  • Responsible for scheduling and tracking of work assigned to the team members.
  • Ensured project and process quality metrics were achieved.
  • Prepared SRS, TDS, MD 120, code review and peer review documents.
  • Maintained all documents in ClearCase, with version control in CVS.
  • Prepared migration, job chain and production support documents.
  • Automated load process using Redwood Cronacle.
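
A small, hypothetical example of the kind of reporting view created on top of the base/package tables for the BI tools (database, table and column names are illustrative):

    -- Reporting view exposing only the grain and columns the reporting layer needs;
    -- REPLACE VIEW is the Teradata idiom for create-or-replace
    REPLACE VIEW PKG_DB.V_DAILY_SALES AS
    SELECT  STORE_ID,
            SALE_DT,
            SUM(SALE_AMT) AS TOTAL_SALE_AMT,
            COUNT(*)      AS TXN_CNT
    FROM    BASE_DB.SALES_BASE
    GROUP BY STORE_ID, SALE_DT;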

Environment: Informatica 8.6, Oracle 9i/10g, TOAD, Teradata, FTP, SFTP, Cronacle, BO, Cognos

Confidential

Responsibilities:
  • Interacted with business users to resolve issues in the existing application and deliver improvements.
  • Monitored production jobs and ensured they completed within SLA limits.
  • Worked on support tickets.
  • Performed impact and root cause analysis.
  • Involved in upgrade activities.
  • Defect Analysis and Defect Prevention Activities.
  • Facilitated KT sessions for the team for knowledge sharing.
  • Estimated the impact of enhancement requests.
  • Debugged issues in the Cognos reporting environment.
  • Generated ad hoc reports on a timely basis per client requests.
  • Provided timely status monitoring, risk tracking and reporting to the manager.
  • Using project schedule and work breakdown structure to guide team in the implementation of the project.

Environment: Informatica 8.1/8.6, Oracle, SQL, PL/SQL, Teradata, UNIX, FTP, Cronacle, Cognos, Cognos Report Net.

Confidential

Responsibilities:
  • Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
  • Based on the requirements, used various transformations like Source Qualifier, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator in the mapping.
  • Worked on enhancements to different modules of EDHR.
  • Performed impact analysis and root cause analysis for issue fixes.
  • Developed and maintained stored procedures and functions using PL/SQL.
  • Responsible for loading reporting views.
  • Involved in integration testing.
  • Informatica Code Migration to different environments and Production.
  • Defect Analysis and Defect Prevention Activities.
  • Deployment and post release support.
  • Performance Improvement of respective application and Knowledge Management.
  • Worked on cube environment of different modules.
  • Generated standalone and drill-through reports.

Environment: Informatica 7.1/8.1, Oracle, SQL, PL/SQL, Teradata, FTP, Redwood, Cognos, Cognos Report Net, Cognos Cube
