
ETL Lead Resume


Atlanta, GA

SUMMARY

  • 8+ years of IT experience in Analysis, Design, Development and Maintenance of various Business applications.
  • Strong experience in developing ETL mappings and scripts using Informatica PowerCenter 9.x/8.x/7.x/6.x, Change Data Capture and Informatica PowerConnect for MQ Series.
  • Experience working with business analysts to identify, study and understand requirements and translate them into ETL code during the requirement analysis phase.
  • Experience in the Banking, Pharmaceutical and Telecommunications domains.
  • Experience in creating High Level Design and Detailed Design documents in the design phase.
  • Expertise in business model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules and Cache Management.
  • Extensively worked on ETL mappings, analysis and documentation of OLAP report requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman), Teradata parallel support and UNIX shell scripting.
  • Proficient in coding optimized Teradata batch processing scripts for data transformation, aggregation and load using BTEQ.
  • Well versed in OLTP and OLAP data modeling, data mart and data warehousing concepts.
  • Strong knowledge of Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions and dimensional modeling (Star Schema and Snowflake Schema).
  • Well versed with Operational Data Store, data warehousing, data mart and data cleansing techniques.
  • Experience in integrating various data sources like Oracle, Sybase and SQL Server and non-relational sources like flat files into the staging area.
  • Extensively worked on developing and debugging Informatica mappings, mapplets, sessions and workflows.
  • Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and mappings using Informatica Designer, and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experience in creating Reusable Tasks (Session, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • Tuned mappings using PowerCenter Designer and used different logic to provide maximum efficiency and performance.
  • Exposure to Java.
  • Experience with Informatica advanced techniques - dynamic caching, memory management and parallel processing to increase performance throughput.
  • Extensively used Control-M, Autosys and Tivoli for job management.
  • Experience with databases like Teradata, SQL Server and Oracle 9i/11g.
  • Worked on generating reports using MicroStrategy and QlikView.
  • Experienced in UNIX work environments for file transfers, job scheduling and error handling.
  • Strong working knowledge of Kalido DIW, Kalido MDM and their architecture.
  • Good experience in SQL, T-SQL, PL/SQL, database joins, etc.
  • Experience in support and knowledge transfer to the production team.
  • Experience in defining, creating, documenting, verifying and executing test cases, creating basic test plans and performing functional, integration and performance testing.
  • Extensive experience working in an onsite-offsite model, coordinating work distribution and status reporting to the client.
  • Hands-on experience technically directing, managing and leading projects throughout the full delivery life cycle, from need realization to post-implementation support.
  • Team player and self-starter, capable of working independently; self-driven and able to motivate a team of professionals.
  • Quick learner with the ability to meet deadlines and work under pressure.
  • Always willing to learn new tools and technologies.
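The BTEQ batch scripts mentioned above were typically generated and driven from shell wrappers; a minimal sketch is shown below, assuming a hypothetical daily aggregate load (the logon string, table and file names are illustrative placeholders, not from any real project):

```shell
#!/bin/sh
# Generate a hypothetical BTEQ batch script for a daily aggregate load.
# Logon credentials, table and file names are placeholders.
BTEQ_SCRIPT=/tmp/daily_agg_load.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.SET ERROROUT STDOUT;

-- Aggregate staged sales into the daily summary table
INSERT INTO dw.daily_sales_agg (sale_date, region_id, total_amt)
SELECT sale_date, region_id, SUM(amount)
FROM   stg.sales
GROUP  BY sale_date, region_id;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# In production this file would be executed as: bteq < "$BTEQ_SCRIPT"
echo "Generated $BTEQ_SCRIPT"
```

The `.IF ERRORCODE` check lets the calling scheduler (Autosys, Control-M, Tidal) detect a failed load from the exit status.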

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.0.1/8.x/7.x/6.x, Informatica PowerExchange 8.x/7.x, Oxygen XML Editor 11.2, SQL Server 2000/2005/2008, TOAD, SQL Assistant

Database: Oracle 11g/10g/9i, Teradata, MS Access, Sybase, SQL Server

Scheduling Tools: Autosys, Control-M, Tidal

Languages: C, C++, JAVA, SQL, PLSQL

Scripting Languages: Shell script, HTML/DHTML, XML, SQL, JavaScript, VBScript

Operating Systems: UNIX, HP-UX, Windows ME/2000/2003/NT/XP, IBM AIX, Sun Solaris 2.x/7/8

Testing Tools: WinRunner 7.6/8.0, QTP 9.5/10.0, LoadRunner 7.6/8.0, Quality Center

Source control tools: Rational ClearCase, EME (Enterprise Meta Environment) and PVCS

Defect Tracking Tools: Rational ClearQuest, TestDirector

Data Mining Tools: Webstatistica, Weka, Clementine, SPSS

Data Modeling: ERwin AllFusion r7.2, ER/Studio 8, Visio

Version control Tools: Microsoft Visual SourceSafe, Tortoise SVN.

PROFESSIONAL EXPERIENCE

Confidential, ATLANTA, GA

ETL Lead

Responsibilities:

  • Interacted with the business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
  • Developed ETL programs using Informatica to implement the business requirements
  • Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
  • Documented the whole process involving the details about source, target and intermediate levels of loading the data.
  • Extensively used ETL to load data from mainframes, Oracle, MQ Series, SAP and flat files to Oracle, flat files, Teradata and XML.
  • Validated Informatica mappings for source compatibility due to version changes at the source.
  • Used parallel processing capabilities, session partitioning and target table partitioning utilities.
  • Made use of post-session success and post-session failure commands in the Session task to execute scripts needed for cleanup and update purposes.
  • Tuned the performance of long-running sessions and fixed the issues.
  • Used Informatica PowerConnect for MQ Series to extract data from message queues and load it into the target data mart.
  • Designed workflows with many sessions using Decision, Assignment, Event Wait and Event Raise tasks; used the Informatica scheduler to schedule jobs.
  • Participated in data modeling and designed staging tables.
  • Prepared the migration document while migrating the data from source to target.
  • Installed and configured staging tables for new sources.
  • Formulated foreign key relationships and lookup information.
  • Executed and tested match rules on the basis of application performance.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Identified problems in existing production data and developed one time scripts to correct them.
  • Developed Unix scripts to move the files to the archival directory and to alert the Business users in case of any file/data delays.
  • Implemented Unix scripts to send an automated email to the team if any failures in the data validation.
  • Involved in unit and system testing to check whether the data loaded into targets, extracted from different source systems, was accurate and met user requirements.
  • Stored reformatted data from relational, flat file and XML sources using Informatica.
  • Monitored the loads of the Enterprise Data Warehouse, ensuring that the system remains consistent and maintains a high level of integrity at all times.
  • Developed PL/SQL and T-SQL scripts to meet the business requirements as needed.
  • Provided a concrete explanation of resolutions to issues, inquiries and requests.
  • Worked both in a group and independently on side projects.
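The archival and alerting scripts described above might look like the following sketch; the directory paths, feed names and the decision to log (rather than mail) the alert are illustrative assumptions, and in production the alert line would be piped into mailx to notify the business users:

```shell
#!/bin/sh
# Sketch of a nightly archival script: move processed feed files into an
# archive directory and flag any expected feed that has not arrived.
# All paths and feed names are hypothetical placeholders.
INBOUND=/tmp/etl_demo/inbound
ARCHIVE=/tmp/etl_demo/archive
ALERT_LOG=/tmp/etl_demo/alerts.log

mkdir -p "$INBOUND" "$ARCHIVE"
: > "$ALERT_LOG"

# Demo data: one feed arrived, one is missing.
touch "$INBOUND/sales_feed.dat"

# Move every arrived file into the archive with a date suffix.
for f in "$INBOUND"/*.dat; do
  [ -e "$f" ] || continue
  mv "$f" "$ARCHIVE/$(basename "$f").$(date +%Y%m%d)"
done

# Alert on any expected feed that never arrived; in production this
# would be: echo "..." | mailx -s "Feed delay" business_users@example.com
for expected in sales_feed.dat orders_feed.dat; do
  if ! ls "$ARCHIVE/$expected".* >/dev/null 2>&1; then
    echo "ALERT: $expected missing $(date)" >> "$ALERT_LOG"
  fi
done
```

Running the sketch archives the arrived feed and records one alert line for the missing one.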

Environment: Informatica 9.5.1, Oracle 11g/10g, UNIX, Linux/Solaris, Teradata R13, SQL Developer, Shell scripting, Tidal

Confidential, CHICAGO, IL

Kalido /Informatica Developer

Responsibilities:

  • Analyzed business requirements and worked closely with the various application and business teams to develop ETL procedures that are consistent across all applications and systems.
  • Developed mappings, sessions and workflows in Informatica Power Center.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow and evaluating transformations, and tuned accordingly for better performance.
  • Documented the whole process, including details about source, target and intermediate levels of loading the data, along with the Autosys job flow.
  • Migrated all the SQL*Loader scripts to Informatica mappings for staging the data.
  • Troubleshot long-running sessions and fixed the issues.
  • Gained understanding of the Commercial Data Warehouse and the source systems (RGM, RGL, HFM) and their mappings.
  • Migrated the model, dimensions, classes of transactions and file definitions from Kalido development to QA and from QA to production.
  • Participated in data modeling and designed staging tables.
  • Prepared the migration document while migrating the data from source to target.
  • Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter.
  • Used Workflow Manager for creating, validating, testing and running sequential and concurrent sessions.
  • Prepared ETL mapping documents for every mapping and a data migration document for smooth transfer of the project from the development environment to testing and then to production.
  • Prepared and used test data/cases to verify accuracy and completeness of the ETL process.
  • Created MDM mappings, SQL queries and landing tables.
  • Installed and configured staging tables for new sources.
  • Participated in addition of new sources for MDM implementation.
  • Built JIL files in Autosys as part of batch processing for Kalido objects and automated them as per the requirements.
  • Provided support to the monthly close Process.
  • Provided a concrete explanation of resolutions to issues, inquiries and requests.
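An Autosys JIL definition of the kind described above is a short attribute list; the sketch below is a hypothetical example (job name, command path, machine, owner and predecessor job are all placeholders):

```jil
/* Hypothetical Autosys JIL for a Kalido dimension load,
   dependent on the success of an upstream staging job. */
insert_job: kalido_dim_load   job_type: c
command: /apps/kalido/bin/load_dims.sh
machine: etlprod01
owner: etladmin
condition: s(stage_extract_done)
std_out_file: /apps/logs/kalido_dim_load.out
std_err_file: /apps/logs/kalido_dim_load.err
alarm_if_fail: 1
```

The `condition` attribute chains jobs into a batch flow, and `alarm_if_fail` raises an operator alarm on a non-zero exit.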

Environment: Kalido, Informatica PowerCenter 9.1, Oracle 11g, Linux/Solaris, UNIX, Windows Scripting, Shell Scripting, Autosys, QlikView, Teradata, ERwin, Microsoft Visio

Confidential, NY

ETL Developer

Responsibilities:

  • Studied and understood all the functional and technical requirements to better serve the project.
  • Created mapplets and reusable transformations and used them in different mappings.
  • Created workflows and used various tasks like Email, Event Wait, Event Raise, Timer, Scheduler, Control, Decision and Session in the Workflow Manager.
  • Validated Informatica mappings for source compatibility due to version changes at the source.
  • Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (connected and unconnected) and Filter.
  • Worked with heterogeneous sources, extracting data from Oracle databases, XML and flat files and loading it into a relational Oracle warehouse.
  • Migrated repository objects, services and scripts from the development environment to the production environment.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Extensive experience in troubleshooting and solving migration issues and production issues.
  • Developed PL/SQL and T-SQL scripts to meet the business requirements as needed.

Environment: Informatica PowerCenter 8.6.1, SQL Server 2005, ERwin 4.1, Windows XP, UNIX, Shell Scripting, Teradata, MS Access

Confidential

Software Engineer

Responsibilities:

  • Building database applications using Oracle.
  • Took training on Informatica and Ab Initio.
  • Involved in Writing PL/SQL triggers and procedures for Data Validation.
  • Performed data modeling and developed Data Flow Diagrams and Entity-Relationship diagrams.
  • Created table creation scripts.
  • Defined various setups for implementing different modules.
  • Developed UNIX shell scripts, Perl scripts and PL/SQL programs for daily and monthly data loads.
  • Involved in reengineering of the existing system: gathering requirements, analyzing source data and identifying business rules for data migration.
  • Developed Informatica mappings to load the data from source to target.
  • Used Ab Initio to develop graphs to extract, cleanse, transform, integrate and load data into Data Warehouse.
  • Extensively used components like Input File, Join, Reformat, Scan, Rollup, Output Table and Update Table.
  • Involved in different types of testing, including system, integration, regression, white box and black box testing.
  • Set environment variables in Ab Initio graphs to allow portability and flexibility during runtime.
  • Used Partition methods and collecting methods to implement parallel processing.
  • Implemented Slowly Changing Dimensions (SCD) of types 1, 2 and 3.
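A type 2 slowly changing dimension load expires the current row and inserts a new version of it. The sketch below shows the pattern the way such SQL was typically staged from shell for the nightly batch; the table and column names are hypothetical, and the insert would in practice be filtered to changed or new rows only:

```shell
#!/bin/sh
# Write out a hypothetical SCD type 2 load script; in production this
# file would be executed by sqlplus or BTEQ from the nightly batch.
SQL_FILE=/tmp/scd2_customer.sql

cat > "$SQL_FILE" <<'EOF'
-- Expire the current version of any customer whose attributes changed.
UPDATE dim_customer d
SET    d.eff_end_dt = CURRENT_DATE, d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.cust_id = d.cust_id
               AND    s.address <> d.address);

-- Insert the new version with an open-ended effective range
-- (in practice restricted to changed or newly arrived customers).
INSERT INTO dim_customer (cust_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s;
EOF

echo "Generated $SQL_FILE"
```

Type 1 simply overwrites the attribute in place, and type 3 keeps the prior value in a dedicated "previous" column instead of a new row.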

Environment: Ab Initio 2.12, Informatica 7.x, SQL, PL/SQL, Oracle, UNIX.
