
ETL Lead Resume


Irving, TX

PROFESSIONAL SUMMARY

  • Dynamic professional with 12+ years of experience implementing data warehouse applications, working across analysis, design, development, testing, and support.
  • Technical expertise in Informatica PowerCenter 9.x/8.x/7.x, Informatica Data Quality (IDQ) 9.x, and Informatica PowerExchange.
  • Experience working with PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Worked extensively on XML and SOAP web services for data integration.
  • Strong Informatica performance tuning experience covering partitioning, pushdown optimization, and source, target, and mapping tuning; exposure to installation and configuration of Informatica PowerCenter 9.5.1.
  • Worked with other ETL tools such as DataStage 8.5 and SSIS 2005 (MSBI); experienced in converting DataStage and SSIS jobs to Informatica.
  • Extensive experience with the Oracle relational database management system (RDBMS), including writing triggers, PL/SQL stored procedures, and packages, and SQL performance tuning.
  • Worked with flat files, XML, Oracle 11g/10g/9i/8i, Netezza, MS SQL Server 2005/2008, Teradata 13/14, Sybase, and DB2 as data integration sources and targets.
  • Experience writing Teradata BTEQ scripts, SQL Server stored procedures in T-SQL, UNIX shell scripts, and Python programs.
  • Experience building operational data stores (ODS), data marts, and enterprise data warehouses; worked on data warehousing, data integration, ETL conversion, and divestiture projects.
  • Worked with dimensional data warehouses in star and snowflake schemas: Slowly Changing Dimension (SCD) Type 1/2/3 mappings, CDC, and physical and logical dimensional data modeling (see the sketch after this list).
  • Developed reporting applications using Cognos Business Intelligence components such as Report Studio, Framework Manager, Query Studio, Analysis Studio, and Dynamic Cubes.
  • Good exposure to the Financial Services, Insurance, and SCM business domains.
  • Excellent communication, interpersonal, and analytical skills; quick learner, adaptive to new and challenging environments.
  • Exposure to the onsite/offshore delivery models followed by different clients; worked in Agile and waterfall SDLC.
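
To illustrate the SCD Type 2 pattern referenced above, here is a minimal Python sketch of the expire-and-insert logic (in practice this lived in Informatica mappings); the `customer_id` business key and `address` tracked attribute are hypothetical placeholders.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended end date marking the current row version

def apply_scd2(dimension, incoming, today=None):
    """Expire the current row when a tracked attribute changes, then insert a new version."""
    today = today or date.today()
    current = {r["customer_id"]: r for r in dimension if r["end_date"] == HIGH_DATE}
    for row in incoming:
        existing = current.get(row["customer_id"])
        if existing is None:  # brand-new business key: insert first version
            dimension.append({**row, "start_date": today, "end_date": HIGH_DATE})
        elif existing["address"] != row["address"]:  # tracked attribute changed
            existing["end_date"] = today             # close out the old version
            dimension.append({**row, "start_date": today, "end_date": HIGH_DATE})
    return dimension
```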

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, Informatica IDQ 9.x, Informatica PowerExchange, DataStage 8.5, SSIS 2005

Databases: Oracle 11g/10g/9i, Teradata 14/13, DB2, MS SQL Server 2008/2005, Netezza, Sybase

BI Tools: Cognos 10/8.4.1 (Framework Manager and Report Studio), Business Objects XI R2

Programming: UNIX Shell Scripting, SQL and PL/SQL, Python, Perl, Java, T-SQL

Operating Systems: Sun Solaris, IBM AIX, HP-UX, Windows 2008 R2/2000/NT

Data Modeling: Erwin 7.0

Other Tools: TD SQL Assistant, Toad, Forms 6i, Quality Center, Remedy, HPSM, SOAP UI, XML Spy, MS Visio

Scheduling Tools: Autosys, Tivoli, Control M

PROFESSIONAL EXPERIENCE

Confidential, Irving, TX

ETL Lead

Responsibilities:

  • Analyzed, designed, and documented the ETL solution and mapping documents for data integration into the Encounters and Party data marts.
  • Designed and implemented end-to-end interfaces for Wellcentive cloud analytics for vendors such as eCW, Athena, and BCBS.
  • Developed Informatica mappings, workflows, and reusable components such as mapplets and worklets.
  • Implemented data quality rules for data standardization.
  • Deployed jobs across the DEV, TUT, and UAT environments.
  • Performed unit and integration testing for all developed jobs, validated test results, and prepared test result documents as SOX evidence.
  • Developed the required Python and shell scripts for input/output file cleansing and archival management (see the sketch after this list).
  • Performance-tuned mappings and the SQL queries used in stored procedures, using explain plans, pre/post-session SQL, and SQL overrides.
  • Implemented an ABCE framework with exception handling strategies to capture errors during load processes, notify the source team of exception records, and automate reloading of failed records and missing files into the warehouse.
  • Upgraded Informatica from 9.5 to 10.1 and implemented LDAP security groups in the Dev, TEST, and PROD environments.
  • Worked with large tables of 60 million rows and a data warehouse of 20 TB.
  • Worked extensively on Tidal job creation, scheduling, and calendar setup according to client requirements.
  • Extracted data from sources such as Teradata, SQL Server, and DB2, and developed workflows for real-time MQ.
  • Wrote and analyzed functional requirements based on business requirements and designed ETL/DQ solutions aligned with information architecture and data governance.
  • Developed mappings that extract data from sources such as Meditech, EPIC, and McKesson.
  • Implemented data integration best practices and conventions, performed code reviews, designed the testing strategy, and elaborated test data sets.
  • Created BTEQ scripts for Teradata and used the FastLoad and MultiLoad utilities.
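
A minimal Python sketch of the kind of file cleansing and archival script described above; the directory paths, file pattern, and cleansing rule (drop blank lines, strip trailing whitespace) are hypothetical assumptions, not the original project code.

```python
import gzip
import shutil
from pathlib import Path

INBOX = Path("/data/inbound")    # hypothetical landing directory
ARCHIVE = Path("/data/archive")  # hypothetical archive directory

def cleanse_and_archive(pattern="*.dat"):
    """Strip blank lines and trailing whitespace, then gzip the original to archive."""
    for src in INBOX.glob(pattern):
        lines = [ln.rstrip() for ln in src.read_text().splitlines() if ln.strip()]
        src.with_suffix(".clean").write_text("\n".join(lines) + "\n")
        with src.open("rb") as f_in, gzip.open(ARCHIVE / (src.name + ".gz"), "wb") as f_out:
            shutil.copyfileobj(f_in, f_out)  # compress the raw file into the archive
        src.unlink()  # remove the original only after a successful archive
```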

ENVIRONMENT & TOOLS: Informatica 10.1, PowerExchange, IDQ 10.0, Teradata 15, Linux, SQL Assistant, TD Studio, MicroStrategy, Erwin 10.0, Tidal, Meditech, McKesson, Athena, ServiceNow.

Confidential, Lake Oswego, OR

ETL Architect

Responsibilities:

  • Analyzed, designed, and documented the ETL solution and mapping documents for data integration and the POU mart in the Finance DWH.
  • Designed and implemented the end-to-end DWH cycle for the POU (Point of Usage) mart, which is also a source for a downstream application, Oracle Demantra (a demand planning tool).
  • Developed Informatica mappings, workflows, and reusable components such as mapplets and worklets.
  • Implemented the data quality rules requested by the business in the incoming transformation stages.
  • Deployed jobs across the DEV, TUT, and UAT environments.
  • Performed unit and integration testing for all developed jobs, validated test results, and prepared test result documents as SOX evidence.
  • Developed the required shell scripts for input/output file handling and archival management.
  • Performance-tuned mappings and the SQL queries used in stored procedures, using explain plans, pre/post-session SQL, and SQL overrides.
  • Designed and implemented a view-swap procedure so the fact table remains available to the reporting layer while it is being loaded (see the sketch after this list).
  • Implemented parallel execution using multiple concurrent workflow instances.
  • Developed scripts to capture data load reconciliation statistics and surfaced them in operational reports.
  • Worked with large tables of 60 million rows and a data warehouse of 20 TB.
  • Worked extensively on Tivoli job creation, scheduling, and calendar setup according to client requirements.
  • Converted existing custom EPM DataStage server jobs to Informatica mappings and workflows.
  • Wrote and analyzed functional requirements based on business requirements and designed ETL/DQ solutions aligned with information architecture and data governance.
  • Developed mappings that extract data from sources such as EPIC HCCL, McKesson, PeopleSoft, and LDGL (DB2).
  • Implemented data integration best practices and conventions, performed code reviews, designed the testing strategy, and elaborated test data sets.
  • Worked with conformed and role-playing dimensions when merging the POU fact with McKesson data.
  • Created BTEQ scripts for Teradata and used the FastLoad and MultiLoad utilities.
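
A minimal Python/DB-API sketch of the view-swap idea above: load into an offline shadow table, then repoint the reporting view in one quick statement so readers never see a half-loaded fact. The table and view names, the qmark parameter style, and the connection object are hypothetical placeholders.

```python
def load_and_swap(conn, batch_rows):
    """Refresh the offline shadow table, then atomically repoint the reporting view."""
    cur = conn.cursor()
    cur.execute("DELETE FROM pou_fact_shadow")  # refresh the offline copy
    cur.executemany(
        "INSERT INTO pou_fact_shadow (item_id, usage_qty, usage_date) VALUES (?, ?, ?)",
        batch_rows,
    )
    # Reporting queries pou_fact_v; repointing the view is near-instant,
    # so the fact stays queryable for the entire load window.
    cur.execute("CREATE OR REPLACE VIEW pou_fact_v AS SELECT * FROM pou_fact_shadow")
    conn.commit()
```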

ENVIRONMENT & TOOLS: Informatica 9.5/9.1/8.6, PowerExchange, IDQ 9.5, Teradata 14/13, Oracle 11g, DB2 8.4, IBM AIX UNIX, SQL Developer, SQL Assistant for TD, Cognos 10 Report Studio, Oracle GoldenGate replication, Erwin 7.0, IBM Tivoli, EPIC, McKesson.

Confidential - New York, NY

Technical Analyst

Responsibilities:

  • Analyzed, designed, and documented the ETL solution and mapping documents for data integration into the LCDB (Legal Compliance DB) DWH.
  • Developed mappings, workflows, and stored procedures to populate product reference data from sources such as Bloomberg, POETS, GMI, and Securities Master Data (SMD).
  • Implemented data remediation solutions for the position-drop issues in various feeds reported by downstream teams.
  • Improved batch performance by implementing session partitions, tuning SQL in source qualifiers and targets, tuning lookups, and applying pushdown optimization.
  • Tuned SQL using hints and resources such as explain plans and trace files.
  • Developed shell scripts for flat file processing, FTP drop-box polling, and archival.
  • Implemented dynamic parameter file generation and merging between workflow sessions (see the sketch after this list).
  • Created new jobs in Control-M and performed job ordering and other batch activities for the APAC, EMEA, and US regions.
  • Developed Orders, Trades, and Position feeds for new interfaces.
  • Prepared and used test data and test cases to verify the accuracy and completeness of the ETL process.
  • Developed Python code snippets for input file consolidation.
  • Coordinated with downstream teams to develop regulatory reports.
  • Developed mappings, process sequences, dictionaries, reference tables, and rules.
  • Performed data profiling and scorecarding in collaboration with the Data Architect.
  • Developed complex quality rules and implementation patterns covering cleansing, parsing, standardization, validation, exception handling, notification, and reporting, with both ETL and real-time considerations.
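
A minimal Python sketch of generating an Informatica parameter file between sessions, as described above; the workflow/session header, parameter names, and file paths are hypothetical placeholders.

```python
from datetime import date

def write_param_file(path, region, run_date=None):
    """Emit a per-region parameter file consumed by the next workflow session."""
    run_date = run_date or date.today()
    lines = [
        "[FOLDER.WF:wf_lcdb_load.ST:s_m_load_positions]",  # hypothetical workflow/session
        f"$$REGION={region}",                              # e.g. APAC / EMEA / US
        f"$$RUN_DATE={run_date:%Y%m%d}",
        f"$InputFile_positions=/data/in/positions_{region}_{run_date:%Y%m%d}.dat",
    ]
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")

write_param_file("/tmp/wf_lcdb_load.par", "EMEA")
```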

ENVIRONMENT & TOOLS: Informatica 9.x/8.6, Informatica IDQ 9.1, Sybase, Greenplum, XML, FpML, Oracle 11g, UNIX, SQL Developer, Allround Automations PL/SQL Developer, Oxygen XML Editor, Oracle GoldenGate replication, Python, MS Visio, Control-M.

Confidential, Denver, CO

Sr ETL Developer / Lead

Responsibilities:

  • Designed and developed the MTD and Returns data marts and a cube for analytics.
  • Worked on the end-to-end data warehouse life cycle.
  • Worked with business analysts and application users to finalize the data model and the functional and detailed technical requirements.
  • Extracted data from heterogeneous sources such as Oracle, SQL Server, SAP, Salesforce, and legacy systems and loaded it into the SQL Server DWH.
  • Created detailed technical specifications for the data warehouse and ETL processes.
  • Reviewed code to ensure coding and design standards were met.
  • Allocated work to team members, mentored them, and helped resolve issues.
  • Created a complex XML splitter to break up the large generated XML files (see the sketch after this list).
  • Worked with web services to extract RMA data from different vendors.
  • Profiled and analyzed source data, and implemented Confidential's data quality rules for all incoming vendor feeds to standardize data quality.
  • Developed complex mappings with various Informatica transformations such as Aggregator, Lookup, Source Qualifier, Update Strategy, Router, Joiner, Filter, and Expression.
  • Wrote complex SQL queries and stored procedures to retrieve data from different sources for validation.
  • Designed and developed efficient error handling methods and implemented them throughout the mappings.
  • Configured, performance-tuned, and installed Informatica; integrated data sources such as Oracle, MS SQL Server, XML, and flat files into the staging area; and designed ETL processes spanning multiple projects.
  • Created various SQL Server database objects such as indexes, views, stored procedures, and T-SQL code snippets for the DWH.
  • Created reusable mapplets and worklets.
  • Used the Informatica Debugger to troubleshoot mapping logic.
  • Populated error tables as part of the ETL process to capture records that failed transformations.
  • Designed the ETL process in Informatica to load from sources to targets through data transformations.
  • Implemented various data transformations using Slowly Changing Dimensions.
  • Developed test cases for unit, integration, and system testing.
  • Maintained the Repository Manager: created repositories, user groups, and folders, and migrated code from Dev to Test and Test to Prod environments.
  • Partitioned sessions for better performance.
  • Wrote SQL and T-SQL scripts to extract data from databases.
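
A minimal Python sketch of the XML-splitting approach above, using streaming iterparse so the full document never sits in memory; the record tag, wrapper element, and chunk size are hypothetical assumptions.

```python
import xml.etree.ElementTree as ET

def split_xml(src_path, tag="record", chunk_size=10000):
    """Stream-parse a large XML file and write it out in fixed-size chunks."""
    chunk, part = [], 0
    for event, elem in ET.iterparse(src_path, events=("end",)):
        if elem.tag == tag:
            chunk.append(ET.tostring(elem, encoding="unicode"))
            elem.clear()  # free the parsed subtree to keep memory flat
            if len(chunk) >= chunk_size:
                part = _flush(chunk, part, src_path)
    if chunk:  # write any remaining records
        _flush(chunk, part, src_path)

def _flush(chunk, part, src_path):
    with open(f"{src_path}.part{part}.xml", "w") as out:
        out.write("<records>\n" + "".join(chunk) + "</records>\n")
    chunk.clear()
    return part + 1
```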

ENVIRONMENT & TOOLS: Informatica 8.6, Oracle 10g/11g, SQL Server 2005, SSRS, SSAS, Greenplum, Windows 2003, SQL Developer, SAP, Salesforce, Sun Solaris, Linux, Autosys, Perl, Oracle e-Business Suite.

Confidential, Raleigh, NC

ETL Developer

Responsibilities:

  • Worked on the end-to-end data warehouse life cycle.
  • Created mapping specification templates for extracting data for different applications.
  • Built jobs to extract, transform, and load data from various databases using Informatica PowerCenter.
  • Maintained the audit repository database, including trimming, shrinking, and backups.
  • Created shell scripts to FTP files from enterprise systems.
  • Worked with the testing team to understand validation requirements when designing mappings.
  • Oversaw the quality procedures related to the project as well as unit testing.
  • Developed complex mappings with various Informatica transformations such as Aggregator, Lookup, Source Qualifier, Update Strategy, Router, Joiner, Filter, and Expression.
  • Involved in admin tasks such as analyzing tablespace requirements, load balancing, and performance.
  • Wrote complex SQL queries to retrieve data from different sources for validation.
  • Designed and developed efficient error handling methods and implemented them throughout the mappings.
  • Used dynamic SQL to build and process SQL statements "on the fly" at run time (see the sketch after this list).
  • Tuned Informatica session performance for large data files by implementing pipeline partitioning and pushdown optimization, increasing block size, data cache size, sequence buffer length, and the target-based commit interval, and resolving bottlenecks.
  • Used Informatica to integrate large volumes of data from legacy data providers and third-party vendors.
  • Wrote UNIX shell scripts and ran them as pre-session and post-session scripts.
  • Developed a Business Objects universe and reports for various balanced scorecards: monthly, quarterly, and yearly reports, charts, bar graphs, drill-down reports, and trend reports.
  • Prepared test data for data-driven tests to exercise the application dynamically.
  • Reviewed computer logs (UNIX logs and shell scripts) and reported program-processing errors.
  • Used Autosys to schedule jobs.
  • Assisted in updating the logical model with all related entities, attributes, and the relationships between entities, based on rules provided by the business manager, using Erwin 4.0.
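
A minimal Python sketch of the dynamic SQL technique above: the statement is assembled from column metadata discovered at run time, with values bound as parameters rather than concatenated. The table/column names, qmark parameter style, and connection object are hypothetical placeholders.

```python
def build_insert(table, columns):
    """Assemble a parameterized INSERT from a column list known only at run time."""
    cols = ", ".join(columns)
    placeholders = ", ".join(["?"] * len(columns))  # DB-API qmark style
    return f"INSERT INTO {table} ({cols}) VALUES ({placeholders})"

def load_rows(conn, table, rows):
    """Load a list of dict records; columns are derived from the first record."""
    columns = sorted(rows[0].keys())
    sql = build_insert(table, columns)
    conn.cursor().executemany(sql, [[r[c] for c in columns] for r in rows])
    conn.commit()
```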

ENVIRONMENT & TOOLS: Informatica PowerCenter 8.x/7.1.3, Erwin 4.0, Windows 2000, Oracle 9i/10g, SQL Server 2000, PL/SQL, Mainframes, DB2, flat files, TOAD, Sun Solaris UNIX, Linux, Perl, Autosys, Business Objects XI.
