
ETL Developer Resume


St. Louis, MO

SUMMARY

  • Over 9 years of professional IT experience in the analysis, design, and development of ETL applications and Business Intelligence solutions for data warehousing and reporting across multiple databases.
  • Expertise in full life cycle development of data warehousing solutions.
  • Good knowledge of normalization concepts.
  • Experience using AWS cloud services for storage and security.
  • Sourced data from source systems such as Oracle, SQL Server, DB2, flat files, Access, and Netezza using ODBC and native connections.
  • Experience in complete Software Development Life cycle (SDLC) including Requirement, Analysis, estimations, Design, Construction, Unit and System Testing and implementation.
  • Expert in Slowly Changing Dimensions (Type 1 through Type 4) for inserting into and updating target tables while maintaining history.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints, and SQL Trace in both Teradata and Oracle.
  • Worked extensively in preparing Unit test cases (UTC) documents, Design documents, for data validation and migration purposes.
  • Experience in implementing complex business rules by creating reusable transformations, and robust mappings/mapplets using different transformations like Unconnected and Connected lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Normalizer etc.
  • Extensive experience in integrating data from various Heterogeneous sources like Relational database (Oracle, SQL Server, Teradata, DB2), Flat Files (Fixed Width and Delimited), COBOL files, XML Files and Excel files into Data Warehouse and DataMart.
  • Experience in Database Design, Entity - Relationship modeling, Dimensional modeling like Star schema and Snowflake schema, Fact and Dimension tables.
  • Solid experience with combined Informatica and Teradata load processes.
  • Strong experience using Teradata utilities like MLOAD, FLOAD, TPUMP, FASTEXPORT and TPT for improving Teradata target load performance. Have also created BTEQ scripts to load data into base tables.
  • Experience in Agile and Waterfall Methodologies.
  • Implemented incremental (delta) data loads using control tables, parameter files, and mapping parameters/variables.
  • Excellent verbal and written communication skills combined with interpersonal and conflict resolution skills and possess strong analytical skills.
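
The Type 2 slowly changing dimension handling highlighted above can be sketched roughly as follows. This is an illustrative example using SQLite; the dim_customer table, its columns, and the sentinel end date are invented for the sketch, not taken from an actual project.

```python
import sqlite3

# Minimal SCD Type 2 sketch: current rows carry end_date = '9999-12-31';
# a changed attribute expires the current row and inserts a new one.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT, start_date TEXT, end_date TEXT)""")
cur.execute("INSERT INTO dim_customer VALUES "
            "(1, 'St. Louis', '2015-01-01', '9999-12-31')")

def scd2_upsert(cust_id, city, load_date):
    """Expire the current row if the tracked attribute changed, then insert
    a new current row so the full history is preserved (SCD Type 2)."""
    cur.execute("SELECT city FROM dim_customer "
                "WHERE cust_id = ? AND end_date = '9999-12-31'", (cust_id,))
    row = cur.fetchone()
    if row and row[0] == city:
        return                      # no change: leave the current row alone
    if row:                         # attribute changed: close out the old row
        cur.execute("UPDATE dim_customer SET end_date = ? "
                    "WHERE cust_id = ? AND end_date = '9999-12-31'",
                    (load_date, cust_id))
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31')",
                (cust_id, city, load_date))

scd2_upsert(1, 'Boston', '2016-06-01')  # change of city: one expired + one current row
```

In a real mapping this logic is implemented with Lookup, Expression, and Update Strategy transformations rather than hand-written SQL; the sketch only shows the row-versioning rule itself.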

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter, Informatica Data Quality (IDQ)

Hadoop Ecosystems: Hadoop, MapReduce, HDFS, Hbase, Zookeeper, Hive, Pig, Sqoop, Cassandra, Oozie, Flume.

Reporting Tools: Cognos v10/8, Business Objects v7.x/6.x (working knowledge)

Version control: GIT

Databases: Oracle 12c/11g/10g, MS SQL Server 2014/2012/2008, Teradata 14/13/12, PostgreSQL, DB2 UDB v8, Netezza.

Languages: T-SQL, PL/SQL, HTML, UNIX shell scripting (basic), C, Java, XML, Perl, Python.

Build/Project Tools: Maven, Jira

DB Tools: Toad, SQL*Loader, SQL Server Management Studio, SQL Developer

Scheduling Tools: Maestro, Autosys, Control-M

Desktop App: Microsoft Office Suite (Word, Excel, PowerPoint, Access, Outlook)

Operating Systems: Windows XP/NT/Vista, UNIX, Linux (Ubuntu)

PROFESSIONAL EXPERIENCE

Confidential, St. Louis, MO

ETL Developer

Responsibilities:

  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Used IDQ standardized plans for addresses and names clean ups.
  • Worked on IDQ file configuration at user’s machines and resolved the issues.
  • Designed, coded, tested, debugged, and documented ETL programs.
  • Extracted data from source systems to a staging database on Teradata using utilities such as MultiLoad and FastLoad.
  • Investigated the current system and created requirements for a new user management application.
  • Created an application for migrating data between systems/databases using PHP and MySQL.
  • Attended weekly meetings to discuss future goals.
  • Developed communication and development skills by working with other developers and administrators.
  • Experience in writing SQL, PL/SQL codes, stored procedures and packages.
  • Experience in full and partial pushdown optimization.
  • Worked on Data Modeling using Star/Snowflake Schema Design, Data Marts, Relational and Dimensional Data Modeling, Slowly Changing Dimensions, Fact and Dimensional tables, Physical and Logical data modeling using Erwin.
  • Provided services including assessment, architecture, and broad management of the Hadoop environment.
  • Involved in the Unit testing and System testing.
  • Extensively worked in the performance tuning of programs, ETL procedures and processes. Error checking and testing of the ETL procedures and programs using Informatica session log.
  • Involved in fixing invalid Mappings, Unit and Integration Testing of Informatica Tasks, Workflows and the Target Data.
  • Created and scheduled sessions; jobs in DAC were scheduled to run on demand, on time, or only once.
  • Coordinated the project tasks with the team and ensured that the projects are completed on time.
  • Created UNIX scripts to automate the activities like start, stop, and abort the Informatica workflows by using PMCMD command in it.
  • Provided production support including error handling and validation of mappings.
  • Worked on Autosys Scheduling Tool for Scheduling and Monitoring of jobs.
  • Used various Informatica Error handling techniques to debug failed session.
  • Used AWS services for networking and storage.
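
The pmcmd-based workflow automation described in the bullets above might be sketched as follows. The service, domain, folder, and workflow names are placeholders, and flag spellings should be verified against the PowerCenter version in use.

```python
# Illustrative helper for the UNIX automation scripts described above:
# build a pmcmd command line to start/stop/abort an Informatica workflow.
# All connection values below are invented placeholders.
def pmcmd_command(action, workflow, folder="ETL_FOLDER",
                  service="INT_SVC", domain="DOM_DEV", user="etl_user"):
    """Return a pmcmd argument list for the given workflow action."""
    if action not in ("startworkflow", "stopworkflow", "abortworkflow"):
        raise ValueError("unsupported pmcmd action: " + action)
    return ["pmcmd", action,
            "-sv", service, "-d", domain,
            "-u", user, "-pv", "PM_PASSWORD",  # password read from an env var
            "-f", folder, "-wait", workflow]

cmd = pmcmd_command("startworkflow", "wf_load_customers")
# A wrapper script would then execute this with subprocess.run(cmd, check=True).
```

Wrapping the command construction in one function keeps the start/stop/abort scripts consistent and makes the failure path (an unknown action) explicit.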

Environment: Informatica PowerCenter/IDQ 9.5.1, SQL Server 2012, Teradata 13, Oracle 12c, Rally (Agile-Scrum process), Siebel, Oracle DAC, Netezza 7.x, Linux/UNIX shell scripting.

Confidential

Informatica Developer

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
  • Worked on applications which were both isolated and had upstream and downstream flows.
  • Extensively worked on Data Masking to mask data in QA/DEV/SIT environments.
  • Identified and Fixed Bugs in existing production mappings after performing a detailed impact Analysis.
  • Designed and developed mappings and mapplets, creating mappings in a pipeline structure.
  • Used a Target Load Plan to execute the mappings sequentially in a pipeline structure.
  • Used mapping parameters and variables.
  • Extracted data from XML files and loaded it into Teradata.
  • Extensively worked with XML files as the Source and Target, used transformations like XML Generator and XML Parser to transform XML files, used Oracle XMLTYPE datatype to store XML files.
  • Leveraged Teradata ecosystem support for Hadoop, specifically in the areas of data access, data movement, manageability, supportability, and serviceability.
  • Worked with Teradata's native Hadoop integration.
  • Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
  • Used Session level parameters such as connection strings.
  • Used MDM for deploying, securing, monitoring, and integrating master data.
  • Worked on Teradata MultiLoad, Teradata Fast-Load utility to load data from Oracle and SQL Server to Teradata.
  • Partitioned Sessions for concurrent loading of data into the target tables.
  • Used SQL Assistant to query Teradata tables.
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Used constraint-based loading, partitioning, and performance tuning on existing mappings.
  • Developed workflows with sequential and parallel sessions.
  • Created worklets to implement reusable logic composed of multiple sessions.
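
The XML-to-relational extraction described in these bullets can be illustrated with a minimal sketch. The order schema below is invented for the example; in practice the equivalent flattening is configured in an XML Parser transformation rather than hand-coded.

```python
import xml.etree.ElementTree as ET

# Invented sample of an XML source file; a real feed would be read from disk.
XML_SOURCE = """
<orders>
  <order id="101"><customer>Acme</customer><amount>250.00</amount></order>
  <order id="102"><customer>Globex</customer><amount>99.50</amount></order>
</orders>
"""

def parse_orders(xml_text):
    """Flatten <order> elements into (id, customer, amount) rows, the same
    shape an XML Parser transformation would hand to downstream targets."""
    root = ET.fromstring(xml_text)
    rows = []
    for order in root.findall("order"):
        rows.append((int(order.get("id")),
                     order.findtext("customer"),
                     float(order.findtext("amount"))))
    return rows

rows = parse_orders(XML_SOURCE)
# rows -> [(101, 'Acme', 250.0), (102, 'Globex', 99.5)]
```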

Environment: Informatica PowerCenter 9.1, UNIX, Oracle 11g, SQL Developer, HP Quality Center, Cognos, Teradata SQL Assistant, IBM Tivoli.

Confidential, Boston, MA

ETL Developer

Responsibilities:

  • Primarily responsible to convert business requirements into system design.
  • Communicate with the Business Analyst and Data Analyst to fully understand requirements, provide feedback, and request clarification, as needed.
  • Worked with various teams doing development, training and documentation at various stages.
  • Review and interpret requirements, data and data models.
  • Used Informatica Designer tools to design the source definition, target definition and transformations to build mappings.
  • Customized solutions to the client's business, drawing on extensive experience with architecture and best practices.
  • Conduct ETL design reviews with the Business Analyst, Data Analyst and DBA to ensure that code meets performance requirements and organization's data standards.
  • Developed the code per tech design by creating mappings using the transformations such as Source qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.
  • Extensively used ETL to load data from sources such as flat files, Salesforce, and XML into Oracle.
  • Extracted data from XML files and loaded it into Teradata.
  • Extensively worked with XML files as the Source and Target, used transformations like XML Generator and XML Parser to transform XML files, used Oracle XMLTYPE datatype to store XML files.
  • Monitored Workflows and Sessions using Workflow Monitor
  • Used Debugger in troubleshooting the mappings.
  • Used AWS services for database, analytics, and code deployment.
  • Performed DIT testing and coordinated with the testing team for SIT and UAT, helping create test scenarios, prepare mock-up data, run jobs, and address defects.
  • Make code ready for implementation following configuration and change management processes
  • Load the project data elements into the Project Data Central (an internal data elements and Metadata repository).
  • Work with Production control team for job scheduling, application hand over and warranty.
  • Experience in using Automation Scheduling tools like Autosys and Control-M.
  • Perform production check out, communicate to the project team, and help business validate the data.
  • Developed PL/SQL procedures for processing business logic in the database and use them as a Stored Procedure Transformation.
  • Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server, TSQL and Oracle PL/SQL.
  • Experienced in UNIX work environments, file transfers, job scheduling, and error handling.
  • Developed shell scripts for daily and weekly loads.
  • Prepare documentation for all mappings and workflows developed/modified as part of the project.
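
The production checkout and data-validation steps above can be illustrated with a simple source-to-target reconciliation sketch. The tables and amounts are invented for the example; a real check would run against the actual source and warehouse databases.

```python
import sqlite3

# Invented source and target tables standing in for a real ETL run.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

def reconcile(cur, src, tgt):
    """Compare row counts and summed amounts between a source and a target
    table; both must match for the load to pass checkout."""
    totals = {}
    for table in (src, tgt):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
        totals[table] = cur.fetchone()
    return totals[src] == totals[tgt]

ok = reconcile(cur, "src_orders", "tgt_orders")  # True when counts and sums agree
```

Count-plus-sum checks are deliberately cheap; row-by-row comparison is reserved for loads that fail this first gate.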

Environment: Informatica PowerCenter 8.5/9.1, Agile, UNIX, Oracle 11g, SQL Developer, HP Quality Center and Cognos.

Confidential, Pleasanton, CA

Informatica Consultant

Responsibilities:

  • Interaction with Business and Operations teams for converting the business requirements into proper technical solutions.
  • Analysis of data feeds and understanding data relationships.
  • Work closely with Domain SMEs and tech leads to get the right data and fix data issues.
  • Extraction, Transformation and Loading of the data using Informatica PowerCenter.
  • Design the target load process based on the requirement and design documents.
  • Enhancing the existing mappings per tech design.
  • Develop Mappings and Workflows to Import source definitions, load the data into Oracle tables, and generate staging files.
  • Unit Testing and debugging the Enhanced mappings.
  • Develop various transformations like Source Qualifier, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into target table.
  • Recovering the failed Sessions and Batches.
  • Data analysis, validation and testing.
  • Developed PL/SQL procedures for processing business logic in the database and use them as a Stored Procedure Transformation.
  • Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server, TSQL and Oracle PL/SQL.
  • Experienced in UNIX work environments, file transfers, job scheduling, and error handling.
  • Developed shell scripts for daily and weekly loads.
  • Experience in using Automation Scheduling tools like Autosys and Control-M.
  • Preparing the documents for test data loading and help testing team for SIT testing.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Ensure metadata repository is accurate and updated regularly.
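
The control-table-driven incremental (delta) load pattern noted in the summary, and relied on in engagements like this one, can be sketched as follows. The etl_control and src_orders tables are illustrative; a real job would key the watermark on a timestamp or surrogate key from the actual source.

```python
import sqlite3

# Invented control table holding the last-loaded watermark per job,
# plus a sample source table with rows before and after the watermark.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE etl_control (job TEXT PRIMARY KEY, last_loaded_id INTEGER)")
cur.execute("INSERT INTO etl_control VALUES ('orders_load', 2)")
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0), (4, 40.0)])

def incremental_extract(cur, job):
    """Pull only the rows added since the last successful run, then advance
    the watermark so a rerun does not re-extract the same rows."""
    cur.execute("SELECT last_loaded_id FROM etl_control WHERE job = ?", (job,))
    (watermark,) = cur.fetchone()
    cur.execute("SELECT order_id, amount FROM src_orders "
                "WHERE order_id > ? ORDER BY order_id", (watermark,))
    rows = cur.fetchall()
    if rows:
        cur.execute("UPDATE etl_control SET last_loaded_id = ? WHERE job = ?",
                    (rows[-1][0], job))
    return rows

delta = incremental_extract(cur, "orders_load")  # rows 3 and 4 only
```

Advancing the watermark only after a successful extract is what makes failed sessions safely restartable: a rerun simply picks up from the last committed point.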

Environment: Informatica PowerCenter 8.1/8.6, Oracle 11g, SQL, TOAD, UNIX, Flat Files, Windows XP, Agile, IBM Mainframe.

Confidential

ETL Developer

Responsibilities:

  • Mainly involved in ETL development.
  • Analyzed the sources and targets, transformed the data, and loaded it into the target database using Informatica.
  • Mainly used transformations such as Expression, Router, Joiner, Lookup, Update Strategy, and Sequence Generator.
  • Design, develop, and test Informatica mappings, workflows, worklets, reusable objects, SQL queries, and shell scripts to implement complex business rules.
  • Design and develop PL/SQL packages and stored procedures, implementing best practices to maintain optimal performance.
  • Created reusable mapplets using the Mapplet Designer and used them in mappings.
  • Extracted data from Oracle, DB2, flat file, and XML sources.
  • Good knowledge of data warehouse concepts and principles (Kimball/Inmon): Star Schema, Snowflake, SCD.
  • Designed and developed UNIX shell scripts as part of the ETL process.
  • Documented flowcharts for the ETL (Extract Transform and Load) flow of data using Microsoft Visio and created metadata documents for the Reports and the mappings developed and Unit test scenario documentation for the mentioned.
  • Performance tuning on sources, targets, mappings, and SQL (optimization) tuning.
  • Supported and Fixed ETL bugs, reporting bugs identified by QA Team using Agile Methodology.
  • Performed the unit testing on the mappings developed according to the Business Scenarios.
  • Prepared the Documentation for the mappings according to the Business Requirements.

Environment: Informatica PowerCenter 7.1/8.6, Oracle 10g, Windows, flat files, UNIX, SQL Developer, PL/SQL, DB2.
