
ETL Developer Resume


Atlanta, GA

SUMMARY

  • 8+ years of IT experience in ETL/DWBI Development, Implementation, Testing and Support.
  • 6+ years of experience in ETL development using Informatica (8.6.1/9.6), Microsoft SSIS and Pentaho Data Integration.
  • 2+ years of experience in ETL testing and BI report testing (Cognos).
  • Knowledge of full life cycle development of data warehousing.
  • Knowledge of Apache Hadoop and Big Data concepts, including Sqoop and Hive.
  • In-depth knowledge of data warehousing applications in the Telecom, Banking, and Financial Services domains.
  • Able to fully understand business rules from high-level design specifications and implement the corresponding data transformation methodologies.
  • Experience participating in review calls for Business Requirement, Functional Specification, and Technical Specification documentation.
  • Extensive experience developing ETL programs for data extraction, transformation, and loading using Informatica PowerCenter, SSIS, and Pentaho Data Integration.
  • Proficient in developing SQL against relational databases such as Oracle, SQL Server, and Teradata.
  • Experienced in integrating and transforming data from databases such as MS Access, Oracle, and SQL Server (PL/SQL) and from flat-file sources.
  • Experience in migrating, executing, and monitoring Informatica objects, SSIS packages, and Pentaho transformations to run test loads.
  • Expertise in scheduling Informatica jobs using the Informatica scheduler, Tivoli Maestro, and UNIX.
  • Experience composing, submitting, and monitoring IBM TWS schedules and jobs as part of test load execution.
  • Experience in high-level documentation, test data preparation, and applying testing methodologies within the STLC process.
  • Extensive experience in black box testing, backend testing, ETL testing, BI report testing (Cognos, MicroStrategy), and UAT.
  • Excellent understanding of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Prior experience in Quality Assurance testing with Test Case Design, Test Data preparation, Test Case execution and Defect tracking.
  • Experience in developing Test plan and presenting to the line of business and team.
  • Experience in developing and writing test cases and process flows.
  • Extensively involved in functional, integration, regression, black box, smoke, sanity, and re-testing.
  • Worked closely with Business users for UAT testing.
  • Experience in Data warehouse Testing (ETL Testing, BI Report Testing).
  • Experience in Test Management tools like Quality Center 9.2, Jira, and Bugzilla.

TECHNICAL SKILLS

Languages Known: UNIX Shell Scripting, PL/SQL

DBMS: Oracle, DB2, SQL Server, Teradata, HP Vertica

Operating Systems: Windows XP, Linux, and UNIX

ETL tools: Informatica Power Center 8.6.1/9.6, Pentaho Data Integration, SSIS

Scheduling tools: IBM Tivoli Workload Scheduler (TWS)

Management Tools: HP Quality Center, JIRA, Bugzilla

Package: MS Office (MS Access, MS Excel, MS PowerPoint, MS Word), Visual Studio

PROFESSIONAL EXPERIENCE

Confidential - Atlanta, GA

ETL Developer

Responsibilities:

  • Gathered user requirements, analyzed source systems, and established mappings between source and target attributes.
  • Participated in the complete life cycle of the project, which involved understanding the scope of the project, functionality, technical design, and complete development.
  • Designed, developed, analyzed, and coordinated application software and architecture using Oracle, Teradata and its utilities, Informatica PowerCenter, UNIX scripting, SQL, GoldenGate, and IBM Tivoli Workload Scheduler (TWS).
  • Analyzed and developed extract, transform, load (ETL) processes for corporate data warehouses using Teradata utilities, Informatica, UNIX, and Oracle.
  • Developed code in Vertica to move data from the existing Teradata environment to Vertica.
  • Developed code to load data from new sources into Teradata and Vertica.
  • Extracted data from flat files, Oracle, and SQL Server, then used Teradata for data warehousing.
  • Used the Teradata utilities BTEQ, FastExport (FEXP), FastLoad (FLOAD), and MultiLoad (MLOAD) to export and load data to/from flat files.
  • Defined the Target Load Order plan to load data correctly into the different target tables.
  • Migrated mappings, sessions, workflows, and mapplets from one environment to another.
  • Worked on UNIX shell scripting and called several shell scripts using the Command task in Workflow Manager.
  • Developed UNIX shell scripts to archive files after extracting and loading data into the warehouse.
  • Developed scripts to load data from source to staging and from the staging area to target tables using load utilities such as BTEQ, FLOAD, and MLOAD.
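
The BTEQ load-and-archive scripting above can be sketched as a shell wrapper. This is a minimal illustration only: the table names, logon, directory layout, and the sample extract file it creates are hypothetical stand-ins, not the actual project objects, and the `bteq` invocation itself is left commented out.

```shell
#!/bin/sh
# Illustrative load-and-archive wrapper; all names and paths are hypothetical.
# 1) Generate the BTEQ script that moves data from staging to target.
# 2) Archive the processed flat file so a rerun does not pick it up again.

SRC_DIR=/tmp/etl_demo
SRC_FILE="$SRC_DIR/customer.dat"
ARCHIVE_DIR="$SRC_DIR/archive/$(date +%Y%m%d)"
BTEQ_SCRIPT=/tmp/load_customer.bteq

# Sample extract file, standing in for the real inbound feed.
mkdir -p "$SRC_DIR"
printf '1|ACME\n2|GLOBEX\n' > "$SRC_FILE"

# Build the BTEQ script; credentials would come from a secured logon file.
cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
INSERT INTO DW_TGT.CUSTOMER
SELECT * FROM DW_STG.CUSTOMER_STG;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
EOF
# bteq < "$BTEQ_SCRIPT"   # actual load step, not run in this sketch

# Date-stamped archive of the processed file.
mkdir -p "$ARCHIVE_DIR"
mv "$SRC_FILE" "$ARCHIVE_DIR/" && gzip -f "$ARCHIVE_DIR/customer.dat"
echo "Archived customer.dat to $ARCHIVE_DIR"
```

In production, the wrapper's exit code would be propagated to the scheduler so that a failed BTEQ step aborts the downstream TWS jobs.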

Confidential - Atlanta, GA

ETL Developer

Responsibilities:

  • Participated in the complete life cycle of the project, which involved understanding scope of the project, functionality, technical design and complete development.
  • Responsible for benefit matching post-conversion and for redesigning the data mapping rules based on the match results.
  • Analysis of Data models of the Legacy System and Target System to understand the usage of both the data and provide conversion solutions.
  • Involved in extraction of data from various sources like flat files, Oracle, SQL Server.
  • Improved the performance of the mappings by moving filter transformations early in the transformation pipeline, performing the filtering at the Source Qualifier for relational sources, and selecting the table with fewer rows as the master table in Joiner transformations.
  • Created mappings and sessions to implement technical enhancements for data warehouse by extracting data from sources like Oracle and Delimited Flat files using Informatica.
  • Prepared various mappings to load the data into different stages like Landing, Staging and Target tables.
  • Used various transformations like Expression, Aggregator, Joiner, Filter, Router, Lookup, Union, Normalizer and Update Strategy in the Mappings.
  • Used partitions in mappings to improve performance when extracting large data volumes from database sources and processing large volumes of files.
  • Worked on TOAD and SQL Developer to prepare SQL Queries to validate the data in both source and target databases.
  • Wrote extensive SQL to confirm the data transfer from source to target.
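
A common file-based complement to the TOAD/SQL Developer validation queries above is to spool the same key columns from source and target to delimited files, then sort and compare them. The sketch below assumes that setup; the sample extracts it writes are illustrative, standing in for real SQL spools.

```shell
#!/bin/sh
# Sketch of source-vs-target reconciliation via sorted-file comparison.
# In practice source.txt and target.txt would be spooled from SQL queries.

WORK=/tmp/recon_demo
mkdir -p "$WORK"

# Illustrative extracts: key|value rows from each side.
printf '1|A\n2|B\n3|C\n' > "$WORK/source.txt"
printf '1|A\n2|B\n4|D\n' > "$WORK/target.txt"

sort "$WORK/source.txt" > "$WORK/source.sorted"
sort "$WORK/target.txt" > "$WORK/target.sorted"

# comm needs sorted input: -23 keeps rows only in source, -13 only in target.
comm -23 "$WORK/source.sorted" "$WORK/target.sorted" > "$WORK/missing_in_target.txt"
comm -13 "$WORK/source.sorted" "$WORK/target.sorted" > "$WORK/extra_in_target.txt"

echo "Rows missing in target:"
cat "$WORK/missing_in_target.txt"
echo "Rows extra in target:"
cat "$WORK/extra_in_target.txt"
```

Any non-empty difference file then feeds the defect report for the failed load.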

Confidential

ETL Developer

Responsibilities:

  • Involved in the analysis of the user requirements and identifying the sources.
  • Involved in the preparation of high-level and low-level design documents.
  • Worked on Pentaho Data Integration and Pentaho Aggregation Designer.
  • Worked on SSIS designer to create ETL packages for DW/BI loads from multiple source systems
  • Involved in extracting data from various legacy sources such as flat files, Oracle, and SQL Server, and loading it to the DW/BI platform.
  • Created SSIS packages using the SSIS designer to extract data from DW/BI for downstream extracts.
  • Designed SSIS packages using several transformations to perform data profiling, data cleansing, and data transformation.
  • Parsed high-level design specifications into simple ETL coding and mapping standards to outline data flow from sources to targets.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Developed transformations and jobs to load into staging tables and then to Dimensions and Facts.
  • Used existing ETL standards to develop these jobs.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Used the PDI Oracle Bulk Loader step to load initial data from flat files into Oracle database tables.
  • Used the Database Join, Database Lookup, Row Normalizer, and Dimension Lookup/Update PDI steps in ETL transformations.
  • Created reusable mappings for Database Lookup and Sequences, which are used in PDI transformations.
  • Modified existing PDI transformation and jobs for enhancements of new business requirements.
  • Wrote UNIX shell scripts using the PDI Kitchen command for command-line execution of PDI jobs and scheduled them in IBM Tivoli Workload Scheduler (TWS).
  • Prepared migration document to move the jobs from development to testing and then to production repositories.
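
A Kitchen wrapper of the kind scheduled through TWS can be sketched as below. The repository name, user, job path, and `PDI_HOME` default are hypothetical; only the `kitchen.sh` options (`-rep`, `-user`, `-dir`, `-job`, `-level`) are standard PDI. `DRY_RUN` (the default here) prints the command instead of invoking Kitchen, and the password option is omitted since credentials would be supplied from a secured file.

```shell
#!/bin/sh
# Sketch of a PDI Kitchen wrapper for command-line / TWS execution.
# Hypothetical repository, user, and job names; DRY_RUN=1 avoids invoking PDI.

PDI_HOME="${PDI_HOME:-/opt/pentaho/data-integration}"
JOB_NAME="${1:-jb_load_sales_dm}"
LOG_FILE="/tmp/pdi_${JOB_NAME}_$(date +%Y%m%d).log"

# Standard Kitchen options: repository, user, repo directory, job, log level.
CMD="$PDI_HOME/kitchen.sh -rep=dw_repo -user=etl_user -dir=/dw_jobs -job=$JOB_NAME -level=Basic"

if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "Would run: $CMD" | tee "$LOG_FILE"
else
    # Kitchen returns non-zero on failure; TWS flags that exit code as ABEND.
    $CMD >> "$LOG_FILE" 2>&1 || echo "Job $JOB_NAME failed; see $LOG_FILE" >&2
fi
```

TWS then schedules the wrapper itself, so job dependencies and reruns are handled entirely by the scheduler.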

Confidential - Plano, TX

Senior Quality Assurance Analyst

Responsibilities:

  • Studied the business requirement and design document and developed system test cases specific to ETL process using HP Quality center.
  • Worked as a quality analyst for different ETL tools: Informatica and SSIS.
  • Understood the functional aspects of the project, identified the customer base to be used for testing, and worked with various upstream systems on test data requests.
  • Reviewed the test results executed by team.
  • Prepared weekly testing status report and conducted daily status meeting.
  • Provided estimates and timelines for system test phase.
  • Performed System Test Environment Setup and ETL Code Migration process.
  • Developed Informatica PowerCenter mappings, PL/SQL queries, and UNIX scripts to validate and compare the source and target tables.
  • Performed integration, regression, and performance testing of the data warehouse application; participated and contributed in system test and defect tracking status calls.
  • Used HP Quality Center, Informatica PowerCenter, SSIS, Oracle, PL/SQL, Teradata, UNIX scripts, and the TWS scheduler in the end-to-end system test environment setup and execution.
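
One simple form of the validate-and-compare scripting above is row-count reconciliation: counts spooled from the source and target sides (via PL/SQL queries) land in a pipe-delimited control file, and a script flags any table whose counts disagree. The table names and counts below are illustrative sample data, not project figures.

```shell
#!/bin/sh
# Sketch of count reconciliation between source and target tables.
# The control file would normally be produced by SQL spools on both sides.

COUNTS=/tmp/counts_demo.txt
RESULT=/tmp/count_mismatches.txt

# Format: table|source_count|target_count (illustrative values)
cat > "$COUNTS" <<'EOF'
CUSTOMER|120034|120034
ACCOUNT|98211|98211
TXN_FACT|5500123|5500098
EOF

# Flag every table whose source and target counts differ.
awk -F'|' '$2 != $3 { print $1 ": source=" $2 " target=" $3 }' "$COUNTS" > "$RESULT"

if [ -s "$RESULT" ]; then
    echo "Count mismatches found:"
    cat "$RESULT"
else
    echo "All table counts match."
fi
```

Mismatched tables then get the row-level SQL comparison, since matching counts alone do not prove the data is identical.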

Confidential

Software Test Engineer

Responsibilities:

  • Worked with the Business analysts and the DBA for requirements gathering, business analysis, testing, metrics and project coordination.
  • Involved in gathering business requirements, studying the application, collecting information from developers, and writing the system and user acceptance test plans; documented the requirements using MS Excel.
  • Performed Web Application Testing and Database Testing.
  • Performed the different type of tests such as Integration tests, Regression tests, User Acceptance tests.
  • Worked with developers who extensively used Cognos report (BI tool) for testing reports.
  • Involved in database testing by writing and executing SQL queries to validate that data was populated in the appropriate tables, and manually compared the results with front-end values.
  • Validated Cognos reports against the requirements using backend SQL.
  • Tested Informatica mappings individually as well as within the entire process; tested workflows.
  • Assisted the production support team with various performance-related issues.

Confidential

Software Engineer

Responsibilities:

  • Analyzed the business requirements and functional specifications.
  • Involved in code migration.
  • Performed code execution in test environment.
  • Performed BI report testing (Cognos reports) and cube testing.
  • Involved in writing and executing test cases to validate the target tables loaded by ETL.
  • Used TestComplete to store all testing results and metrics, implemented the test plan document, and created test cases.
  • Created and executed the test cases for various scenarios, and participated actively in system, integration, performance and regression testing for every new build released.
  • Performed Development, Integration, System Integration, End to End and User Acceptance Testing for the data services.
  • Prepared status reports and defect reports.
  • Managed defects in the defect tracking tool HP Quality Center and assigned bugs to the appropriate developers through the tool.
