Senior ETL Developer Resume

McLean, Virginia

PROFESSIONAL SUMMARY:

  • Over 6 years of IT experience in Enterprise Application Integration (EAI) and Data Conversion/Migration projects.
  • Certified AWS Architect and Developer Associate, and Scrum certified.
  • Worked extensively on ETL/BI and project management tools (Informatica Power Center, Data Analyzer, Informatica Data Quality, SQL, Master Data Management, Tableau, VersionOne, JIRA, Visio) for development, administration, data profiling, analysis, dashboards, project management, and reporting.
  • Experience in Agile/Scrum methodologies covering requirements gathering, data analysis, design, implementation, modification, session and report analysis, master data management, test data management, version control, job scheduling (Autosys), performance tuning, and code review and regression testing of ETL/data warehouse projects.
  • Proven expertise in Extraction, Transformation, and Loading (ETL) of data across heterogeneous source/target systems such as RDBMS (DB2, Oracle, SQL Server, and Sybase) and file-based systems (flat files, Excel, XML, SAP R3).
  • Good understanding of data modeling with star/snowflake schema design in the Erwin tool, OLAP, data marts, relational/dimensional data models, Slowly Changing Dimensions (SCD), Change Data Capture (CDC), and fact/dimension tables.
  • Knowledge of NoSQL databases (MongoDB), Hive (Hadoop), and Amazon Web Services concepts: S3, VPC, EC2, Route 53, Redshift, DynamoDB.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 10.2/9.5/9.1/8.6/8.1, Data Analyzer, Informatica Data Quality 10.2 (IDQ), IBM OPTIM (Data Masking), Informatica Metadata Manager, Informatica Power Exchange, Master Data Management (MDM).

Cloud technologies: Amazon Web Services (AWS-certified), Informatica Cloud

Project Methodologies: Agile (Scrum Master certified), Waterfall, Pair programming

Reporting Tools: SSRS, Tableau, Pivot tables

Test Data Management Tool: IBM OPTIM

Code Management Tool: Git

XML Tool: Altova XMLSpy, Visual Studio

MS: Visio, Erwin

Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2005/2008, IBM DB2, Sybase, Mainframe

Database Utilities: IBM Data Studio, Toad, SQL Developer, Rapid SQL, DBArtisan.

Scheduling Methods: CRON, Autosys (JIL), Workload Control Center (WCC)

Languages: C, C++, SQL, PL/SQL, XML, HTML, Java

Helpdesk Tools: BMC ITSM Remedy Suite 7.6, Service Now

Project Management Tools: JIRA, Version One, Microsoft Visio

Defect Tracking Tool: HP Application Lifecycle Management (ALM), JAMA

Scripting: UNIX, Python

Data Transfer Tools: Databridge 6.0/6.1 (Attachmate), FTP, WinSCP

Others: MATLAB, GloMoSim, Agilent ADS, MS Office Suite, Data Access.

EXPERIENCE:

Confidential, McLean, Virginia

Senior ETL Developer

Responsibilities:

  • Work with business analysts, architects, and clients to gather and understand business requirements.
  • Participate in requirements definition, solution development, and the implementation phases of the project.
  • Generate multi-level XML container files.
  • Develop ETL logic for data conversion of Confidential business data from sources such as Sybase, Mainframe DB2, DB2 UDB, SQL Server, and flat files into XML data consumed by CSS.
  • Work extensively with XML files as source and target, using transformations such as XML Generator, XML Parser, and the Java transformation to process XML/PDF files.
  • Implement ETL code migrations and other Informatica admin tasks across the DEV, QA, and PRD environments while maintaining code integrity; identify, track, report, and resolve data and ETL code issues.
  • Work with Altova XMLSpy and Java in Eclipse to validate XML files against XSDs.
  • Perform metadata analysis using Informatica Metadata Manager and create conversion logic in Informatica v9.6.1.
  • Use parameter files to initialize mapping parameters and connection strings.
  • Create DSN connections to SAP, Mainframe DB2, and Sybase systems.
  • Perform database activities (running DDL, creating triggers and indexes) in Rapid SQL and SQL Server Management Studio; a representative sketch follows this list.
  • Follow Agile methodology with three-week sprints, with tasks tracked in VersionOne.
  • Good understanding of data modeling using star/snowflake schema design, OLAP, OLTP, data marts, relational and dimensional data modeling, Slowly Changing Dimensions (SCD), fact and dimension tables, and master/transactional data management.
  • Perform project management tasks such as CR analysis and creating WSRs and the WBS for the project.
  • Develop Job Information Language (JIL) scripts to schedule Informatica jobs in Autosys; experienced in Scrum and Agile work methodologies.
  • Perform unit and integration testing and performance tuning of transformations, sessions, mappings, and workflows; write test plans and strategies for probable data conditions to ensure the stability and usability of the Informatica code/design per requirements.
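
A representative sketch of the routine DDL work mentioned above (all object and column names are illustrative, not taken from the actual project):

    -- Create a staging table for converted records, an index to speed
    -- conversion lookups, and an audit trigger that stamps updated rows.
    -- T-SQL flavor, as run from SQL Server Management Studio.
    CREATE TABLE stg_case_record (
        case_id     INT           NOT NULL PRIMARY KEY,
        case_xml    VARCHAR(MAX)  NULL,
        updated_at  DATETIME      NOT NULL DEFAULT GETDATE()
    );

    CREATE INDEX ix_stg_case_updated ON stg_case_record (updated_at);
    GO

    CREATE TRIGGER trg_stg_case_audit
    ON stg_case_record AFTER UPDATE AS
    BEGIN
        UPDATE s
        SET    updated_at = GETDATE()
        FROM   stg_case_record s
        JOIN   inserted i ON i.case_id = s.case_id;
    END;
    GO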

Environment: Informatica Power Center Client tools v9.6.1, UNIX, SQL, Rapid SQL, Altova XMLSpy, Sybase, Mainframe DB2, VersionOne, integration and code testing, ServiceNow, MS Visio, MS Office Suite, Autosys, Workload Control Center, Eclipse, Java, GitHub, AWS.

Confidential, Confidential, Connecticut

Informatica Developer

Responsibilities:

  • Collaborated with business analysts and solution architects to gather conversion requirements.
  • Performed data conversion (ETL) of legacy data from Confidential's various social services systems for integration into the new application, ImpaCT.
  • Performed data analysis of source-to-target system mappings.
  • Designed data conversion logic for data mappings to extract and transform legacy data into target databases.
  • Implemented CDC (Change Data Capture) using mapping variables; a sketch of the pattern follows this list.
  • Developed extraction, transformation, and loading (ETL) processes with Informatica Power Center across the full development life cycle: requirements gathering, analysis, design, development, testing, and documentation.
  • Worked in Informatica Power Center v9.5/9.6.1 for ETL code design and Implementation.
  • Worked with IBM Data Studio, SQL Developer for connecting DB2 and SQL Server systems.
  • Reviewed Informatica ETL code, unit tested the ETL flows, and tracked ETL tasks in the Atlassian tool JIRA.
  • Created a dynamic parameter file used by all ETL mappings on a daily basis.
  • Worked on table mapping, exception data handling, and data profiling, and authored the Requirements Traceability Matrix (RTM), Technical Architecture Document (TAD), and Mapping Specification Documents (MSD).
  • Worked on Informatica-to-ImpaCT application integration to ensure data was fed and replicated in the application front end.
  • Worked with the IBM data masking tool, Optim, to ensure state PII data is not exposed on the wire.
  • Created Technical Architecture Design (TAD) Document explaining the complete structure of the data Conversion process from Legacy to the Base systems.
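
As a rough illustration of the CDC pattern above (table and column names are invented for the example, not from the ImpaCT project), the source-qualifier SQL filters on a watermark carried by an Informatica mapping variable, which the mapping advances with SETMAXVARIABLE as rows flow through:

    -- Incremental extract: pull only rows changed since the previous run.
    -- $$LAST_EXTRACT_TS is a persisted Informatica mapping variable that
    -- Informatica substitutes into the SQL override before execution.
    SELECT case_id,
           client_id,
           benefit_amt,
           last_updt_ts
    FROM   legacy.case_master
    WHERE  last_updt_ts > TIMESTAMP '$$LAST_EXTRACT_TS'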

Environment: Informatica Power Center Client tools v9.6/9.5, UNIX, SQL, IBM Optim, SQL Developer, TOAD for SQL Server 2000/05/08 and DB2 systems, IBM Data Studio, MS Visual Studio SSRS, integration and code testing, JIRA, JAMA, MS Visio, MS Office Suite, job scheduling.

Confidential

ETL Engineer (Software Engineer)

Responsibilities:

  • Worked in Informatica Power Center 8.6/9.1/9.5 for ETL code design and Implementation.
  • Worked with heterogeneous source/target systems like Flat files, Excel, XML, SAP R3, DB2, Oracle, SQL Server for developing ETL code per the business requirements.
  • Created complex mappings and configured workflows/sessions using Power Center Client tools.
  • Used Type 1 and Type 2 SCD mappings to update slowly changing dimensions; a SQL sketch of the Type 2 logic follows this list.
  • Used Informatica Debugger to debug mappings and implemented Parameter files to centralize all parameters/variables.
  • Created deployment groups and performed object migrations (sessions, folders, workflows, mappings) whenever requested by application teams.
  • Familiar with Job scheduling methods - CRON and Autosys.
  • Worked on end-to-end test implementation of data objects in Informatica using test case plans to ensure that only valid ETL code exists per business requirements.
  • Created metadata reports using Data Analyzer; performed data profiling using Informatica Data Quality (IDQ), data integration (Informatica, SSIS), and reporting (SSRS, Cognos, Tableau).
  • Worked on Data profiling to establish the Data Quality Rules using Informatica Data Quality (IDQ).
  • Automated ITSM reports on service calls (incident, change, and problem tickets) with BMC ITSM Remedy Suite to share comprehensive status updates with the teams.
  • Prepared documentation for ETL strategies & code movement, source versus destination system mappings involved in building and supporting the extraction flows.
  • Worked on visual presentation of data using Tableau dashboards.
  • Provided 24x7 on-call production support for the applications on a rotation basis.
  • Configured Integration Service nodes, Repository Services, and source/target connections; managed user permissions and access privileges and created user, group, and folder accounts in the Admin Console and Repository Manager tools.
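
A minimal SQL sketch of the Type 2 SCD logic referenced above, shown as plain SQL rather than the equivalent Informatica mapping (table and column names are illustrative):

    -- Step 1: expire the active dimension row when a tracked attribute changed.
    UPDATE dim_customer
    SET    eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  current_flag = 'Y'
      AND  customer_id IN (SELECT s.customer_id
                           FROM   stg_customer s
                           JOIN   dim_customer d
                                  ON  d.customer_id  = s.customer_id
                                  AND d.current_flag = 'Y'
                           WHERE  s.cust_name <> d.cust_name);

    -- Step 2: insert a new current version for changed and brand-new keys
    -- (after step 1, neither has an active row).
    INSERT INTO dim_customer
        (customer_id, cust_name, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id  = s.customer_id
                         AND  d.current_flag = 'Y');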

Environment: Informatica Power Center Client tools v8.6/9.1/9.5, UNIX, Data Analyzer, SQL, Informatica Data Quality (IDQ), TOAD for Oracle, SQL Server 2000/05/08 and DB2 systems, MS Visual Studio SSRS and SSIS, integration and code testing, BMC ITSM Remedy Suite 7.6, MS Visio, CRON, Autosys, Tableau, MS Office Suite.

Confidential

Informatica Analyst

Responsibilities:

  • Extracted/Loaded data from various systems like DB2, SQL Server, Flat files into Oracle data warehouse.
  • Designed data flow business architecture in MS Visio to show master and transactional data movement between applications within the business.
  • Created persistent lookup caches, modified lookup SQL queries, tuned transformations, and implemented incremental aggregation for better performance; an example override follows this list.
  • Managed Informatica PC 8.6, Metadata Manager, and Data Analyzer in DEV, QA and Prod environments.
  • Created users and groups and granted permissions/privileges to access Informatica repositories across the DEV, QA, and PRD environments in the Admin Console.
  • Good understanding of ITIL; attended to service requests (incident, change, and problem management) in adherence to the SLA using BMC ITSM Remedy Suite v7.6.
  • Performed Data Quality Analysis using Informatica Data Quality (IDQ) to validate data.
  • Performed data objects validation across QA and Production Environment to make sure no invalid data exists.
  • Worked on heterogeneous Database platforms - Oracle 10g/11g, SQL Server 2005, UDB DB2.
  • Backed up repositories and truncated metadata files during Informatica maintenance.
  • Resolved Service Requests/calls from application teams with timely updates regarding Data Object migration between DEV, QA and PRD environments, Connection requests, Data issues using BMC ITSM Remedy Suite.
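
To illustrate the lookup tuning mentioned above, a lookup SQL override of roughly this shape keeps a persistent cache small by caching only the latest row per key (table and column names are illustrative, not from the actual project):

    -- Lookup SQL override: cache one current row per account instead of the
    -- full history, so the persistent cache stays compact across runs.
    SELECT account_id,
           account_status,
           updated_ts
    FROM   acct_history h
    WHERE  updated_ts = (SELECT MAX(h2.updated_ts)
                         FROM   acct_history h2
                         WHERE  h2.account_id = h.account_id)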

Environment: Informatica Power Center Client tools v8.6/9.1, UNIX, SQL, Informatica Data Quality (IDQ), Data Analyzer, TOAD for Oracle, SQL Server 2000/05/08 and DB2 systems, MS Visual Studio SSIS, BMC ITSM Remedy Suite 7.6, MS Visio, SQL*Plus, MS Office Suite.

Confidential

Research Associate

Responsibilities:

  • Designed and implemented wireless networks using GloMoSim and OPNET.
  • Worked on designing and simulating RF components such as resonator, filters in Agilent Advanced Design System (ADS) system.
  • Extensive research done on RF Bulk Acoustic Wave technology.
  • Tested data accuracy by writing complex Oracle SQL using analytic functions; an example follows this list.
  • Loaded data from various sources like DB2, SQL Server, Flat files into Oracle data warehouse.
  • Scheduled Informatica Jobs and supported Application teams in ETL design development process.
  • Prepared test data/cases to verify accuracy and completeness of ETL process.
  • Developed ADS designs and MATLAB programs to visualize data by intensity and frequency of data loads and signal strength for wireless network implementation.
  • Published a research paper in the Advanced Material Science Journal, March 2011, China.
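
A small example of the kind of Oracle analytic-function check used to test load accuracy (table and column names are illustrative):

    -- Compare each day's load volume with the prior day's using LAG, and
    -- rank load sizes, to spot missing or duplicated extracts.
    SELECT load_dt,
           row_cnt,
           row_cnt - LAG(row_cnt) OVER (ORDER BY load_dt) AS delta_vs_prior,
           RANK() OVER (ORDER BY row_cnt DESC)            AS size_rank
    FROM  (SELECT TRUNC(load_ts) AS load_dt,
                  COUNT(*)       AS row_cnt
           FROM   dw_sales_fact
           GROUP  BY TRUNC(load_ts))
    ORDER  BY load_dt;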

Environment: GloMoSim, OPNET, Agilent Advanced Design Systems, MATLAB, C programming, Visual Basic, J2EE.
