
Sr. ETL Developer Resume


SUMMARY

  • Around 7 years of Data Warehousing experience using Informatica Power Center 10.x/9.x/8.x/7.x, Warehouse Designer, Oracle, DB2, Power Analyzer, Power Plug, Power Exchange, ETL, Data Mart, OLAP, OLTP.
  • Extensive experience in Business Analysis, Application Design, Data Modeling, Development, Implementation and Testing Data warehouse and Database applications for Health Care, Insurance, Commercial and Capital Market/Financial Services.
  • Extensive experience in creating the Workflows, Worklets, Mappings, Mapplets, Reusable transformations and scheduling Workflows and sessions using Informatica Power Center.
  • Knowledge of Informatica Cloud, including the Cloud Mapping Designer (CMD), Data Replication Tasks (DRT) and Data Synchronization Tasks (DST).
  • Used Informatica Power Exchange for loading and retrieving data from mainframe systems.
  • Experience in defining and configuring landing tables, staging tables, base objects, lookups, query groups, queries/custom queries, packages, hierarchies and foreign - key relationships.
  • Experience in configuring Entity Base Objects, Entity Types, Relationship Base Objects, Relationship Types, Profiles using Hierarchy tool.
  • Good knowledge of Teradata utilities BTEQ, FASTLOAD, FASTEXPORT, MULTILOAD and TPUMP.
  • Developed Mappings, Sessions and Workflows for relational & flat file sources and targets.
  • Proficiency in data warehousing techniques for data cleansing and slowly changing dimension types (1, 2 and 3).
  • Experienced in installation, configuration and administration of the Informatica Power Center/PowerMart client and server.
  • Extensive experience in all areas of Project Life Cycle including requirements analysis, system analysis, design, development, documentation, testing, implementation and maintenance.
  • Experience in integration of various data sources like IMS, Oracle, DB2, SQL Server, Flat Files, MQ Series and XML files into ODS, and extensive knowledge of Oracle 12c/11g/10g/9i/8i, DB2 8.0/7.0, Teradata, MS SQL Server 2005/2000/7.0/6.5, Sybase 12.x/11.x, MS Access 7.0/2000.
  • Expertise in SQL/PL-SQL: developing and executing stored procedures, functions and triggers, and tuning queries while extracting and loading data.
  • Experience in creating documentation such as support documents and unit test documents.
  • Experience in Performance Tuning and Debugging of existing ETL processes.
  • Experience in UNIX shell scripting and configuring Cron jobs for Informatica job scheduling, backup of repository and folder.
  • Exposure to the MicroStrategy schema & application layers, report analysis, and Report Services (dynamic dashboards) design, development, implementation, migration and testing.
  • Exposure to creating visualizations such as bar, line, pie, map, scatter plot, Gantt, bubble, histogram, bullet, heat map and highlight table charts using Tableau.
  • Strong experience in interacting with business users, gathering requirements through interviews, workshops, and existing system documentation or procedures, defining business processes, identifying and analyzing risks using appropriate templates and Analyst tool.
  • Working knowledge of data replication tools.
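
As an illustration of the cron-driven Informatica job scheduling and repository backups mentioned above, a nightly wrapper might look like the following sketch. The repository name, paths and pmrep invocation are hypothetical placeholders, not details from any specific engagement.

```shell
#!/bin/sh
# Nightly repository-backup wrapper, suitable for a cron entry such as:
#   0 2 * * * /opt/etl/bin/backup_repo.sh
# REPO_NAME and BACKUP_DIR are hypothetical values.
REPO_NAME="DEV_REPO"
BACKUP_DIR="/tmp/infa_backups"
STAMP=$(date +%Y%m%d)
BACKUP_FILE="$BACKUP_DIR/${REPO_NAME}_${STAMP}.rep"

mkdir -p "$BACKUP_DIR"

# In a real environment this would call the Informatica pmrep client, e.g.:
#   pmrep connect -r "$REPO_NAME" -d DOMAIN -n admin -x secret
#   pmrep backup -o "$BACKUP_FILE"
# Here we only create the dated target file to show the naming convention.
touch "$BACKUP_FILE"
echo "backup written to $BACKUP_FILE"
```

The dated filename makes it easy to age out old backups with a second cron job.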

TECHNICAL SKILLS

Operating Systems: Solaris, HP-UX, Red Hat Linux, Windows NT/2000/XP

Packages: MS Office Suite, MS Visio, Rational Rose, Lotus Notes, XML Spy

ETL: Informatica Power Center 10.x/9.x/8.x/7.x, Informatica Power Exchange 9.x./8.x, SSIS

Dimensional Data Modeling: Star & Snowflake schemas, Ralph Kimball methodology, Erwin.

Reporting Tools: MicroStrategy, Tableau, COGNOS Series 8/7, Cognos Decision Stream 7

Programming Languages: C, C++, Java, VB, SQL, PL/SQL, UNIX Shell Scripting.

DB Tools: TOAD, SQL Navigator, SQL LOADER.

RDBMS: Oracle 8/8i/9i/10g/11g/12c, Oracle EBS R12 SQL, MS Access, Vertica, Teradata, MS SQL Server 2000/2005/2008, IBM DB2.

GUI: Visual Basic 6.0/5.0

Scheduling: UC4, Autosys, Control-M

PROFESSIONAL EXPERIENCE

Confidential

Sr ETL Developer

Responsibilities:

  • Translate business requirements into a scalable data model and coordinate with technology teams to design and implement Data Warehouses/Data Marts.
  • Involved in Project scope meetings and set project goals and benefits.
  • Involved in creating, monitoring, modifying, communicating the project plan with other team members.
  • Develop tactical and strategic plans to implement technology solutions, and effectively manage client expectations.
  • Worked with conceptual, logical and physical data models, data flows and databases; along with creating/maintaining conceptual, logical and physical data documentation.
  • Created and proposed technical design documentation which includes current and future functionality, database objects affected, specifications, and flows/diagrams to detail the proposed database and/or Data Integration implementation.
  • Created logical data models, designing new database schema or re-use existing structures, determining data movement strategy as well as data availability.
  • Design, Develop and Test database objects to run ETL processes that support new implementation and scripts to extract data from legacy systems, transform and load data to the Data Warehouse.
  • Define and implement the optimal ETL process to collect data in varied formats from sources such as EPIC, GE and Banner, and populate the data warehouse using SSIS packages, functions and stored procedures.
  • Implemented source- and target-based partitioning for existing workflows in production to improve performance and cut running time.
  • Processed facts and dimensions daily via SSIS tasks scheduled through SQL Server Agent.
  • Develop functional and integration test scripts for operations, backup and recovery, data security, etc. and conduct performance tests to optimize ETL processes.
  • Develop complex data extracts and ad-hoc queries as requested.
  • Handle end-to-end process integrations, customization, maintenance and deployment.
  • Develop ETL processes to monitor, control and maintain data integrity, accuracy and security.
  • Perform integration for operations, backup and recovery, data security.
  • Support system integration and quality assurance testing.
  • Analyze ETL processes for speed optimization and contribute to system architectural design.
  • Conduct data profiling and design target data structures (data models) in compliance with the corporate data architecture.
  • Worked closely with database administration team and ensure the database is architected and designed to handle concurrent ETL and reporting query jobs.
  • Adhere to and implement enterprise architecture concepts, principles and data access strategy with project deliverables.
  • Prepared the migration document showing the paths for input and output files, configuration files, table scripts, stored procedures and SSIS packages.

Environment: MS SQL Server Integration Service 2017 (SSIS), Tableau 10.1, TSQL

Confidential

Sr. ETL Informatica Developer

Responsibilities:

  • Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema, Snowflake Schema.
  • Initiate the data modeling sessions, to design and build/append appropriate data mart models to support the reporting needs of applications.
  • Efficiently implemented Change Data Capture (CDC) to extract information from numerous Oracle tables, and applied Pushdown Optimization in Informatica Power Center.
  • Developed mappings that perform extraction, transformation and load of source data into the Derived Masters schema, using Informatica Power Center transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer and Update Strategy to meet the business logic.
  • Used Conversion process for VSAM to ASCII source files using Informatica Power Exchange.
  • Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.
  • Extensively used Joins, Triggers, Stored Procedures and Functions in Interaction with backend database using PL/SQL.
  • Designed data loading programs to load Asset categories and Asset key flex field values. Converted legacy data to R 12.0.6 and assets data by using Web ADI.
  • Developed the bank statement loading process and set up bank mapping and forecasting templates. Streamlined revaluation, translation, payment system, and month-end and year-end processes for Oracle Applications R12. Set up document sequences, sourcing rules, purchase order matching, approval groups and assignments.
  • Extracted data from various data sources such as Oracle, SQL Server, Flat files and transformed and loaded into targets using Informatica.
  • Created users, groups and gave read/write permissions on the respective Folders of repository.
  • Designed Mappings by including the logic of restart.
  • Handle any production failures occurring at run time.
  • Work on defects and provide resolution and a plan to deploy into production.
  • Work on any improvement or code change being planned to run in production.
  • Validate the test cases as well as produce a risk estimation sheet, with clear information on risk factor involved for downstream partners because of the change.
  • Coordinated with developers for prioritizing and translating requirements, defects and enhancements into product feature.
  • Created post-session UNIX scripts to perform operations such as gunzip, remove and touch on files.
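
The post-session UNIX scripts above (gunzip, remove, touch) follow a common file-handoff pattern. The sketch below shows it with hypothetical paths, and creates its own sample input so it can run anywhere.

```shell
#!/bin/sh
# Post-session cleanup sketch: unpack the inbound extract, drop any stale
# copy, and touch a done-file for downstream jobs. All paths are hypothetical.
SRC_DIR="/tmp/etl_inbound"
mkdir -p "$SRC_DIR"

# Stand-in for the compressed file an upstream system would deliver.
echo "sample row" > "$SRC_DIR/extract.dat"
gzip -f "$SRC_DIR/extract.dat"

gunzip -f "$SRC_DIR/extract.dat.gz"   # unpack before the session reads it
rm -f "$SRC_DIR/extract.dat.old"      # remove any leftover from the last run
touch "$SRC_DIR/extract.done"         # signal downstream that the file is ready
echo "inbound file ready: $SRC_DIR/extract.dat"
```

The empty `.done` flag file lets a scheduler poll for readiness without parsing the extract itself.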

Environment: Informatica PowerCenter 10/9.6.3, Informatica Power Exchange 9.x, Talend, Oracle EBS R12 SQL, Oracle 12c, PL/SQL, IBM DB2 Client, UNIX, Autosys, uDeploy

Confidential, St Louis, MO

ETL Informatica Developer

Responsibilities:

  • Analyzed business requirements, performed source system analysis, prepared technical design document and source to target data mapping document.
  • Involved in design and development of Informatica mappings, UNIX Scripts, PL/SQL Procedures, wrote SQL, PL/SQL scripts, Function and Package code.
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Extracted data from various sources like Oracle, Teradata, DB2, Flat Files, Mainframes and SQL Server.
  • Extensively worked in the Performance tuning of the programs, ETL Procedures and Processes.
  • Used SQL tools like SQL Developer to run SQL queries and validate the data loaded into the target tables.
  • Applied partitioning at the session level for mappings that load data to the target using a target lookup to avoid duplicate records.
  • Performed performance tuning using session partitions, dynamic cache memory and index cache.
  • Created reusable transformations and mapplets based on the business rules to ease the development.
  • Created and modified existing batch jobs using BTEQ scripts to load data from Preserve area to Staging area.
  • Worked on Teradata SQL Assistant querying the source/target tables to validate the BTEQ scripts.
  • Efficiently implemented Change Data Capture (CDC) to extract information from numerous Oracle tables, and applied Pushdown Optimization in Informatica Power Center.
  • Worked with ETL Developers in creating External Batches to execute mappings, Mapplets using Informatica workflow designer to integrate Shire's data from varied sources like Oracle, DB2, flat files and SQL databases and loaded into landing tables of Informatica MDM Hub.
  • Worked extensively on Source Analyzer, Mapping Designer, Mapplet designer and Warehouse Designer and Transformation Developer.
  • Extracted data from source systems to a staging database running on Teradata using utilities like Multi Load and Fast Load.
  • Designed and developed Informatica Mappings and sessions based on business rules.
  • Developed several Mappings and Mapplets using corresponding Sources, Targets and Transformations.
  • Created flexible mappings/sessions using parameters, variables and heavily using parameter files.
  • Extensive knowledge of SOAP and REST (GET, POST, PUT and DELETE) methods.
  • Worked extensively with XML and JSON.
  • Used UC4 Automic for job scheduling and ClearCase for migration.
  • Improved session run times by partitioning sessions; was also involved in database fine-tuning (creating indexes, stored procedures, etc.) and partitioning Oracle databases.
  • Created Materialized view for summary data to improve the query performance.
  • Performed unit testing, knowledge transfer and mentored other team members.
  • Been a part of the SOA (Service-Oriented Architecture) team enforcing best practices for services (REST and SOAP).
  • Have been a part of the Integration team gathering requirements to provide Enterprise level Services across the company.
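
The BTEQ batch jobs described above, which move data from the Preserve area to Staging, typically wrap a generated script. The sketch below writes such a script with hypothetical host, logon and table names; the actual bteq submission is shown only as a comment.

```shell
#!/bin/sh
# Generate a BTEQ script that moves the day's rows from a Preserve table
# to Staging. Host, logon and table names are hypothetical placeholders.
BTEQ_SCRIPT="/tmp/load_staging.btq"

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdhost/etl_user,etl_pass;
INSERT INTO stg_db.customer_stg
SELECT * FROM prs_db.customer_prs
WHERE load_dt = CURRENT_DATE;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# In production the batch job would then submit it, e.g.:
#   bteq < "$BTEQ_SCRIPT" >> /var/log/etl/load_staging.log 2>&1
echo "generated $BTEQ_SCRIPT"
```

Quitting with a nonzero code on SQL errors lets the scheduler (UC4 here) detect and alert on a failed load.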

Environment: Informatica Power Center 9.6/9.5.1/9.1, Informatica Power Exchange 9.x, Teradata 14.10, Oracle 11g/12c, SQL Server 2005, PL/SQL, Star Schema, UNIX Shell Scripts, Flat files, UC4 Scheduling tool, JSON, SOA, REST, SOAP, ClearCase Migration tool, Tableau.

Confidential, Greenwood Village, CO

ETL Informatica developer

Responsibilities:

  • Analyzed specifications and identified source data to be moved to the Data Mart; participated in design team and user requirement gathering meetings; created Data Maps/Extraction groups for legacy parent sources.
  • Designed Sources to Targets mapping from primarily Flat files to Database using Informatica Power Center.
  • Developed UNIX Shell Scripts and SQLs to get data from Oracle tables before executing Informatica workflows.
  • Staged Data from legacy Proclaim IMS system into Oracle 11g Master Tables.
  • Performed CDC capture registrations.
  • Assisted in building the ETL source to Target specification documents by understanding the business requirements.
  • Developed Talend ETL jobs to push the data into Talend MDM and develop the jobs to extract the data from MDM.
  • Developed mappings that perform Extraction, Transformation and load of source data into Derived Masters schema using various Informatica power center transformations like Source Qualifier, Aggregator, Filter, Router, Sequence Generator, look up, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer and update strategy to meet business logic in the mappings.
  • Used Teradata utilities such as FastLoad, MultiLoad, FastExport and BTEQ.
  • Extracted data from Teradata source systems to a flat file.
  • Built reusable transformations and Mapplets wherever logic would otherwise be redundant.
  • Performed performance tuning at both the mapping level and the database level to increase data throughput.
  • Created jobs and job streams in the Control-M scheduling tool to schedule Informatica, SQL script and shell script jobs.
  • Coordinated with developers for prioritizing and translating requirements, defects and enhancements into product feature.
  • Tuned ODBC driver to improve data transfer rates between database and MicroStrategy servers located at different physical locations.
  • Used database-specific MicroStrategy OLAP pass-through functions such as ApplySimple.
  • Created post-session UNIX scripts to perform operations such as gunzip, remove and touch on files.
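
The pre-workflow UNIX/SQL checks described above (querying Oracle tables before executing Informatica workflows) usually take the form of a gate script. In the sketch below the Oracle query appears only as a comment and a stub count stands in, so the table names, credentials and pmcmd call are all hypothetical.

```shell
#!/bin/sh
# Pre-run gate sketch: check how many unprocessed source rows exist before
# launching the Informatica workflow. In production the count would come
# from SQL*Plus against Oracle, e.g.:
#   ROW_COUNT=$(sqlplus -s etl_user/secret@SRCDB <<'SQL'
#   SET HEADING OFF FEEDBACK OFF
#   SELECT COUNT(*) FROM src_claims WHERE processed = 'N';
#   SQL
#   )
# A stub value stands in here so the gate logic itself can run anywhere.
ROW_COUNT=42

if [ "$ROW_COUNT" -gt 0 ]; then
  # pmcmd startworkflow -sv INT_SVC -d DOMAIN -f CLAIMS wf_load_claims
  echo "GATE_OPEN: $ROW_COUNT rows pending" > /tmp/etl_gate.status
else
  echo "GATE_CLOSED: nothing to load" > /tmp/etl_gate.status
fi
cat /tmp/etl_gate.status
```

Writing the gate decision to a status file gives the scheduler (Control-M here) a simple artifact to check before releasing downstream jobs.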

Environment: Informatica Power Center 8.6.1/9.1, SAP R/3, Oracle 11g, Teradata, Toad 10g, XML, Control-M, Mainframe (IMS), SQL, UNIX, Windows XP, DB2, MicroStrategy.
