Sr ETL Developer Resume

Denver, CO

PROFESSIONAL SUMMARY:

  • 8+ years of total IT experience in Analysis, Design, Development, Testing and Implementation of Data Warehouse and Business Intelligence solutions.
  • Experience in project scoping, leading and managing teams. Handled multiple roles - Project Lead, Technical Lead, Developer and interim Scrum Master.
  • Proven track record in planning, building, and maintaining successful large-scale Data Warehouse and Decision Support Systems. Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data Loading and Data Management.
  • Hands-on experience in Data Modeling, Data Warehouse Design, Development and Testing using the ETL tool Informatica and the reporting tools Cognos and Business Objects (BO).
  • Strong hands-on experience with Teradata architecture and utilities (FastLoad, MultiLoad, FastExport, BTEQ, and SQL Assistant).
  • Experienced in creating complex Informatica Mappings using various transformations, and developing strategies for Extraction, Transformation and Loading (ETL) mechanism.
  • Worked extensively with Informatica Designer tools: Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
  • Extensively worked with XML files as the Source and Target, using transformations such as XML Generator and XML Parser to transform XML files.
  • Hands-on experience on UNIX Scripting and Stored Procedures.
  • Exposure to Informatica Cloud Services.
  • Experience in the integration of various data sources such as Flat files, Oracle and Teradata into a staging area and then into Data Marts / EDW.
  • Performance tuning, including collecting statistics, analyzing explain plans and determining which tables needed statistics; increased performance by 35-40% in some situations (a minimal Teradata sketch of this approach follows this list). Performed database health checks and tuned databases using Teradata Manager.
  • Knowledge of Workload Analyzer, Database Designer, and creating and populating Vertica databases.
  • Knowledge of loading data into, backing up, restoring and recovering Vertica databases, as well as Vertica installation.
  • Capture metadata, capacity plans and performance metrics for all the Teradata queries being created.
  • Provide transition to the RTS team, review code, test plans and test cases, and provide support during production go-live.
  • Extensively worked and supported the Code Migration across different environments from Development, QA, PreProd and Production.
  • Extensive experience in full life cycle development with emphasis on Project Management, User Acceptance, Programming and Documentation.
  • Performed data validation by Unit Testing, Integration Testing and User Acceptance Testing.
  • Extensive experience in managing teams, requirement analysis, code reviews and implementing standards.
  • Able to work autonomously, with team and project management experience.
  • Excellent analytical and logical programming skills with a good understanding at the conceptual level. Excellent presentation and interpersonal skills with a strong desire to achieve specified goals, and excellent written and verbal communication skills, including experience in proposal and presentation creation.
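
A minimal Teradata sketch of the statistics-driven tuning approach referenced above: check the optimizer plan with EXPLAIN, then collect statistics on the join and filter columns the plan depends on. All table and column names here are hypothetical, for illustration only.

    -- Review the optimizer plan before tuning.
    EXPLAIN
    SELECT c.customer_id, SUM(s.sale_amt)
    FROM   sales_fact s
    JOIN   customer_dim c ON c.customer_key = s.customer_key
    WHERE  s.sale_date BETWEEN DATE '2015-01-01' AND DATE '2015-12-31'
    GROUP  BY c.customer_id;

    -- Collect statistics on the join and filter columns, then re-check the EXPLAIN output.
    COLLECT STATISTICS ON sales_fact COLUMN (customer_key);
    COLLECT STATISTICS ON sales_fact COLUMN (sale_date);
    COLLECT STATISTICS ON customer_dim COLUMN (customer_key);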

TECHNICAL SKILLS:

  • Informatica (10.6, 9.6.1, 9.1.0, 8.6.1, 8.1.1, 7.1.5)
  • Teradata and Utilities (V 14.01.0.02, V 13.10.0.03)
  • Teradata SQL Assistant (V 14.01.0.02, V 13.10.0.03)
  • Vertica
  • UNIX & LINUX
  • Master data management
  • ERwin
  • Autosys
  • IDQ
  • Oracle (RDBMS) 8, 9i, 10g, 11, 12c; Oracle Forms/Reports, PL/SQL, TOAD
  • OBIEE 10g/11g
  • Business Objects (XI 3.1/R2/6.5/5.x)
  • Amazon Web Services
  • SSIS, SSRS

PROFESSIONAL EXPERIENCE:

Confidential, Denver, CO

Sr ETL Developer

Platforms: Teradata, Informatica Power Center 10.6, DB2, WLM, Control-M

Responsibilities:

  • Prioritize workload, providing timely and accurate resolutions for the incidents raised. Perform production support activities involving issue analysis and resolution within the specified SLAs.
  • Provide daily support with resolution of escalated tickets and act as liaison to the business and client manager to ensure issues are resolved in a timely manner.
  • Monitor PROD jobs and dependent application job changes that impact production support, communicate project information to the production support staff and raise production support issues to the project team.
  • Participate in knowledge transfer to ensure a better grasp of the product and domain.
  • Suggest fixes to complex issues by doing a thorough analysis of root cause and impact of the defect.
  • Coordinate wif Application Development Team to successfully deploy code in both User Acceptance Testing and Production environments
  • Perform Unit, Functional, Integration and Performance testing in UAT by ensuring data quality.
  • Performance tuning, including collecting statistics, analyzing explain plans and determining which tables needed statistics. Increased performance by 20% for one of the FT Informatica mappings.
  • Back up the team wherever required, make sure the team is aware of the complexities of the stories and challenge the team on requirements for the estimation process.
  • Extensively worked with XML files as the Source and Target, used transformations such as XML Generator and XML Parser to transform XML files, and used the Oracle XMLTYPE data type to store XML files (a sketch of this pattern follows this list).
  • Used major components like Serializers, Parsers, Mappers and Streamers in Data Transformation Studio for conversion of XML files to other formats.
  • Provided support for our Vertica infrastructure.
  • Install Vertica version updates and patches.
  • Work with Analytics teams to develop business initiatives and algorithms as they relate to Vertica.
  • Working on Informatica MDM Hub development support, with detailed technical knowledge of Informatica MDM products.
  • Working effectively in a fast-paced environment as part of a high-performing research, delivery and sustainment team
  • Contribute to continuous improvement by leveraging quality improvement methodologies.
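
A minimal Oracle sketch of the XMLTYPE pattern mentioned above: land the raw XML in an XMLTYPE column, then shred it into relational rows with XMLTABLE for downstream mappings. Table, element and column names are hypothetical.

    -- Land the raw XML document in an XMLTYPE column.
    CREATE TABLE stg_orders_xml (
        order_doc_id  NUMBER PRIMARY KEY,
        order_doc     XMLTYPE
    );

    -- Shred each <order> element into a relational row.
    SELECT x.order_id, x.order_total
    FROM   stg_orders_xml s,
           XMLTABLE('/orders/order'
                    PASSING s.order_doc
                    COLUMNS order_id    NUMBER        PATH '@id',
                            order_total NUMBER(12,2)  PATH 'total') x;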

Confidential, CA

Sr ETL Developer

Platforms: Informatica Power Center 9.6.1, DB2, Autosys

Responsibilities:

  • Involved in requirement gathering and requirement analysis. Interacted with the client on various forums to discuss the status of the project.
  • Developed Informatica mappings and workflows with complex business logic to load data from source to DB2, and extensively worked on validation queries (a sketch of typical validation checks follows this list).
  • Worked with XML files as the Source and Target, using transformations such as XML Generator and XML Parser to transform XML files.
  • Worked on UNIX Scripts for moving the encrypted files from WORM location to NM Location.
  • Performance tuning and query optimization for Informatica mappings.
  • Created low-level and high-level documents for the mappings created and was involved in both technical and functional testing.
  • Was responsible in managing tasks and deadlines for the ETL teams both Onsite and Offshore.
  • Capture Metadata, Capacity Plan and Performance metrics for all the queries being created and define archival strategy and provide guidance for performance tuning. Fixing the performance issues and working on query optimization.
  • Perform Unit, Functional, Integration and Performance testing, thereby ensuring data quality.
  • 24/7 support for e-Delivery related production jobs, making sure the jobs run in production without failure.
  • Drive the scrum calls on a daily basis, get updates on stories, make sure the team is working towards the commitment and take responsibility for on-time deliverables.
  • Back up the team wherever required, make sure the team is aware of the complexities of the stories and challenge the team on requirements for the estimation process.
  • Scheduling jobs using Autosys and making sure they are running as defined.
  • Interacting with third-party teams such as DBA and Migration teams for migrating the code to QA and PROD.
  • Involved in migration of the mappings from IDQ to Power Center.
  • Applied the rules and profiled the source and target tables' data using IDQ.
  • Experienced in using the IDQ tool for profiling, applying rules and developing mappings to move data from source to target systems. Involved in massive data profiling using IDQ prior to data staging.
  • Used IDQ to complete initial data profiling and remove duplicate data. Worked on IDQ admin tasks, serving as both IDQ Admin and IDQ Developer.
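
The validation queries mentioned above typically reduce to row-count and set-difference checks between staging and target. A hedged DB2 sketch, with hypothetical schema, table and column names:

    -- 1) Row-count reconciliation between the staging source and the DB2 target.
    SELECT (SELECT COUNT(*) FROM stg.policy_src) AS src_cnt,
           (SELECT COUNT(*) FROM dw.policy_tgt)  AS tgt_cnt
    FROM   sysibm.sysdummy1;

    -- 2) Rows present in the source but missing or different in the target.
    SELECT policy_id, policy_status, premium_amt
    FROM   stg.policy_src
    EXCEPT
    SELECT policy_id, policy_status, premium_amt
    FROM   dw.policy_tgt;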

Confidential, Gaithersburg, MD

ETL Developer

Platforms: Teradata V13, Informatica Power Center 9.1.1, Cognos

Responsibilities:

  • Involved in Design Part of the project by creating Business mapping documents.
  • Requirement study and understanding business logic and prepared low level Source to Target mapping documents.
  • Write BTEQ scripts for validation & data integrity between source and target databases and for report generation.
  • Capture Metadata, Capacity Plan and Performance metrics for all the queries being created and Define archival strategy and provide guidance for performance tuning.
  • Created and Imported/Exported various Sources, Targets, and Transformations using Informatica Power Center 8.6.
  • Worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer and Mapplet Designer.
  • Created reusable transformations and mapplets by using Lookup, Aggregator, Normalizer, Update strategy, Expression, Joiner, Rank, Router, Filter, and Sequence Generator etc. in the Transformation Developer, and Mapplet Designer, respectively.
  • Schedule jobs for ETL batch processing using Cronacle Scheduler.
  • Extensively worked with data conversions, standardizing, correcting, error tracking and enhancing the data.
  • Extensively worked on Stored Procedures.
  • Implemented the process for capturing incremental data from source systems and moving it to the data warehouse area.
  • Involved in onsite-offshore coordination and in performance tuning with Informatica and the database.
  • Involved in migration of objects from DEV to QA and worked with the DBA to create schema instances and tables in DEV, QA & PROD.
  • Created, updated and maintained ETL technical documentation.
  • Interacted with third-party teams such as DBA and Migration teams for migrating the code to QA and PROD.
  • Involved in Design, Development & Testing phases and in scheduling of jobs using the Cronacle tool.
  • Experience in design, development and implementation of AWS-based applications using AWS services.
  • Created Cloud infrastructure (EC2/S3/RDS) by configuring and deploying various AWS services in production and pre-production environments.
  • Designed suitable AWS-based solutions based on client requirements by gathering detailed functional and technical requirements and identifying problem solutions and their implementation.
  • Generate monthly order reports using OLAP functions for internal and external audits (a small example query follows this list).
  • Provide project status updates to senior management detailing project status, potential project risk, and recommended mitigations.
  • Expert-level skills in testing Enterprise Data Warehouses using Informatica Power Center, DataStage, Ab Initio, and SSIS ETL tools.
  • Experienced in Extracting, Transforming and Loading (ETL) data from Excel, flat files and Oracle to MS SQL Server using the BCP utility, DTS and SSIS services.
  • Created ETL processes using SSIS to transfer data from heterogeneous data sources.
  • Created logging for the ETL load at the package and task levels to log the number of records processed by each package and each task in a package using SSIS.
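
A small sketch of the OLAP-function reporting referenced above: monthly totals plus a running year-to-date figure per region, using Teradata window functions. Table and column names are hypothetical.

    SELECT region_cd,
           order_month,
           SUM(order_amt) AS monthly_total,
           -- Running total of the monthly aggregate within each region.
           SUM(SUM(order_amt)) OVER (PARTITION BY region_cd
                                     ORDER BY order_month
                                     ROWS UNBOUNDED PRECEDING) AS ytd_total
    FROM   order_fact
    GROUP  BY region_cd, order_month;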

Confidential, Jacksonville, FL

ETL Developer

Responsibilities:

  • Involved in Design Part of the Project by Creating FSD & Business mapping documents.
  • Requirement study and understanding business logic and prepared low level Source to Target mapping documents.
  • Write BTEQ scripts for validation & data integrity between source and target databases and for report generation (a representative BTEQ sketch follows this list).
  • Capture Metadata, Capacity Plan and Performance metrics for all the queries being created and Define archival strategy and provide guidance for performance tuning.
  • Created and Imported/Exported various Sources, Targets, and Transformations using Informatica Power Center 8.6.
  • Worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer and Mapplet Designer.
  • Created reusable transformations and mapplets by using Lookup, Aggregator, Normalizer, Update strategy, Expression, Joiner, Rank, Router, Filter, and Sequence Generator etc. in the Transformation Developer, and Mapplet Designer, respectively.
  • Schedule jobs for ETL batch processing using Cronacle Scheduler.
  • Extensively worked with data conversions, standardizing, correcting, error tracking and enhancing the data.
  • Implemented the process for capturing incremental data from source systems and moving it to the data warehouse area.
  • Involved in onsite-offshore coordination and performance tuning with Informatica and the database.
  • Involved in migration of objects from DEV to QA and worked with the DBA to create schema instances and tables in DEV, QA & PROD.
  • Created, updated and maintained ETL technical documentation.
  • Interacted with third-party teams such as DBA and Migration teams for migrating the code to QA and PROD.
  • Involved in Design, Development & Testing phases and in scheduling of jobs using the Cronacle tool.
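
The BTEQ validation scripts referenced above generally follow the pattern below: compare source and target counts, abort with a non-zero return code on a mismatch, then export a load summary report. The TDPID, credentials, database and table names are placeholders.

    .LOGON tdprod/etl_user,etl_password;

    -- Any row returned here means source and target counts do not match.
    SELECT 'COUNT MISMATCH'
    FROM   (SELECT COUNT(*) AS cnt FROM stg.claims_src) s,
           (SELECT COUNT(*) AS cnt FROM edw.claims_tgt) t
    WHERE  s.cnt <> t.cnt;

    .IF ACTIVITYCOUNT > 0 THEN .QUIT 8;

    -- Export a load summary for report consumers.
    .EXPORT REPORT FILE = claims_load_report.txt;
    SELECT load_date, COUNT(*) AS rows_loaded
    FROM   edw.claims_tgt
    GROUP  BY load_date
    ORDER  BY load_date;
    .EXPORT RESET;

    .LOGOFF;
    .QUIT 0;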

Confidential, Newark, CA

ETL Developer/Data Analyst

Responsibilities:

  • Achieved 23% savings in total CPU consumption.
  • Achieved approximately 10% reduction in total IO consumption.
  • Reduced query runtimes by 30 to 50%.
  • Recovered approximately 50% of wasted CPU.
  • Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure and other transformations to implement complex logic.
  • Worked with Informatica Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract data from multiple source systems such as Oracle, SQL Server and Flat files and load it into Oracle.
  • Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.
  • Experience in performance tuning of Informatica (sources, mappings, targets and sessions) and tuning the SQL queries.
  • Automated the ETL process by scheduling exception-handling routines, as well as source-to-target mapping development, support and maintenance.
  • Migrated/upgraded Informatica from version 8.x to 9.x.

Confidential

ETL Developer

Responsibilities:

  • Creating mappings to load data from various sources, using different transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Normalizer, Filter and Router transformations, Union transformations, etc.
  • Responsible for Design, Data Mapping Analysis and Mapping rules
  • Responsible for project management and planning.
  • Worked on loading data from several flat file sources to Staging using MLOAD and FLOAD.
  • Created BTEQ scripts with data transformations for loading the base tables.
  • Generated reports using Teradata BTEQ.
  • Used the FastExport utility to extract large volumes of data and send files to downstream applications.
  • Developed TPump scripts to load low-volume data into the Teradata RDBMS in near real-time.
  • Extensively used Informatica Power Center 8.6 to extract data from various sources, which included Flat files, SQL Server, Oracle, MS Access and XML.
  • Developed complex mappings using multiple sources and targets in different databases Oracle, DB2, flat and XML files.
  • Created BTEQ scripts to load data from the Teradata staging area to the Teradata base tables.
  • Performed tuning of Queries for optimum performance.
  • Worked with Business Analysts and stakeholders to understand the business requirements.
  • Involved in designing the blueprint and approach documents for high-level development and design of the ETL applications.
  • Developed complex ETL mappings to load dimension and fact tables.
  • Worked with Type 1 and Type 2 dimensional mappings to load data from source to target (a Type 2 load sketch follows this list).
  • Worked on agile methodology and JIRA.
  • Designed blueprint and approach documents for each story and obtained approvals from the business analysts to design the ETL mappings.
  • Developed mappings, sessions and workflows, and ran the jobs.
  • Used transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator to develop the mappings.
  • Used reusable transformations from the shared folder to implement the same logic in different mappings.
  • Worked with the testing team to resolve issues found while testing ETL applications.
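
A hedged SQL sketch of the Type 2 dimension loading referenced above: expire the current row when a tracked attribute changes, then insert the new version. The dimension, staging table and columns are hypothetical.

    -- Step 1: close out current rows whose tracked attribute (address) has changed.
    UPDATE customer_dim
    SET    current_flag = 'N',
           effective_end_dt = CURRENT_DATE - 1
    WHERE  current_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   customer_stg s
                   WHERE  s.customer_id = customer_dim.customer_id
                     AND  s.address <> customer_dim.address);

    -- Step 2: insert a fresh current row for every staged customer that no longer
    -- has an open version (covers both changed and brand-new customers).
    INSERT INTO customer_dim
           (customer_id, customer_name, address, effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.customer_name, s.address,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   customer_stg s
    LEFT JOIN customer_dim d
           ON d.customer_id = s.customer_id
          AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;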

Environment: Informatica Power Center 8.1.1/7.1, FTP, Windows XP, Oracle 9i, DB2, PL/SQL, SQL Server, Flat files, UNIX, Shell Scripting.
