Sr. ETL / Data Engineer Resume
Whitestone, NY
SUMMARY
- 9+ years of extensive experience with Informatica PowerCenter in all phases of analysis, design, development, implementation, and support of data warehousing applications using Informatica PowerCenter 10.x/9.x/8.x, IDQ, Informatica Developer, IDE, SSIS, and IDS.
- Experience in analysis, design, and development of enterprise-level data warehouses using Informatica. Experience in the complete Software Development Life Cycle (SDLC): requirement analysis, design, development, and testing.
- Experience in Data Modeling and Data Analysis using Dimensional and Relational Data Modeling, Star Schema/Snowflake Modeling, Fact and Dimension tables, and Physical and Logical Data Modeling.
- Expertise in developing standard and reusable mappings using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Lookup, and Router.
- Experienced in integrating various data sources such as Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005/2000, Teradata, Netezza, Sybase, DB2, flat files, XML files, and Salesforce into staging areas and different target databases.
- Designed complex mappings and have expertise in performance tuning and in loading slowly changing dimension tables and fact tables.
- Excellent working experience in the insurance industry, with strong business knowledge of the Auto, Life, and Health Care lines of business.
- Worked on scheduling tools Informatica Scheduler, Autosys, Tivoli/Maestro & CONTROL-M.
- Experience in PL/SQL Programming and in writing Stored Procedures, Functions etc.
- Experience in creating complex mappings using various transformations and developing Extraction, Transformation, and Loading (ETL) strategies using Informatica 10.x/9.x/8.x/7.x/6.x.
- Experience in source systems analysis and data extraction from various sources such as flat files, Oracle 11g/10g/9i/8i, IBM DB2 UDB, and XML files.
- Extensively worked on Informatica Data Quality (IDQ) and Informatica PowerCenter throughout a complete IDQ project.
- Designed and developed IDQ mappings for address validation/cleansing, doctor master data matching, data conversion, exception handling, and exception data reporting.
- Documented source/target row counts, analyzed the rejected rows, and worked on re-loading them.
- Performed the data profiling and analysis making use of Informatica Data Quality (IDQ).
- Experience in UNIX shell scripting, Perl scripting and automation of ETL Processes.
- Experience in support and knowledge transfer to the production team.
- Prepared user requirement documentation for mapping and additional functionality.
- Extensively used PowerCenter/PowerExchange to load data from source systems such as flat files and Excel files into staging tables and then into the target Oracle database. Analyzed the existing systems and performed a feasibility study.
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 10.2/9.6/9.5/9.1/8.6/7.x/6.x, Salesforce, Informatica Cloud, Informatica PowerExchange 5.1/4.7/1.7, PowerAnalyzer 3.5, Informatica Data Quality (IDQ) 9.6.1/9.5.1, Informatica PowerConnect and Metadata Manager, Informatica Data Services (IDS) 9.6.1, DataStage
Databases: Oracle 12c/10g/9i/8i/8.0/7.x, Teradata 13, DB2 UDB 8.1, MS SQL Server 2008/2005.
Operating Systems: UNIX (Sun-Solaris, HP-UX), Windows NT/XP/Vista, MSDOS
Programming: SQL, SQL-Plus, PL/SQL, Perl, UNIX Shell Scripting
Reporting Tools: Business Objects XI R2/6.5/5.1/5.0, Cognos Impromptu 7.0/6.0/5.0, Informatica Analytics Delivery Platform, MicroStrategy.
Modeling Tools: Erwin 4.1 and MS Visio
Other Tools: SQL Navigator, Rapid SQL for DB2, Quest TOAD for Oracle, SQL Developer 1.5.1, Autosys, Telnet, MS SharePoint, Mercury Quality Center, Tivoli Job Scheduling Console, JIRA, Netezza.
Methodologies: Agile, Ralph Kimball.
PROFESSIONAL EXPERIENCE
Confidential - Whitestone, NY
Sr. ETL / Data Engineer
Responsibilities:
- Gathered requirements and analyzed, designed, coded, and tested highly efficient and highly scalable integration solutions using Informatica, Oracle, and SQL across the various source systems.
- Involved in the technical analysis of data profiling, mappings, formats, and data types, and in the development of data movement programs using PowerExchange and Informatica.
- Developed the ETL mapping document, covering the data model implementation, the incremental/full load logic, and the ETL methodology.
- Involved in writing Teradata SQL bulk programs and in performance tuning of Teradata SQL statements, using Teradata EXPLAIN and PMON to analyze and improve query performance.
- Developed various ETL processes for complete end-to-end data integration, including data integration between DB2 and Oracle.
- Designed and developed mappings, transformations, sessions, workflows, and ETL batch jobs to load data from sources to stage using Informatica, T-SQL, UNIX shell scripts, and Control-M scheduling.
- Developed various mappings for extracting data from different source systems using Informatica and PL/SQL stored procedures.
- Developed mappings for extracting data from different types of source systems (flat files, XML files, relational sources, etc.) into our data warehouse using PowerCenter.
- Used Informatica PowerCenter 10.1.0 to extract, transform, and load data into the Netezza data warehouse from various sources such as Oracle and flat files, and was responsible for creating shell scripts to invoke the Informatica workflows through the command line (a wrapper sketch follows this list).
- Developed UNIX shell scripts for data extraction and for running the pre/post processes and PL/SQL procedures.
- Used Teradata SQL Assistant, Teradata Administrator, and PMON, along with data load/export utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump, and TPT in UNIX/Windows environments, and ran the batch processes for Teradata.
- Developed standard mappings using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup, and Filter.
- Involved in business analysis and technical design sessions with business and technical staff to develop entity-relationship/data models, requirements documents, and ETL specifications, and dug deep into complex T-SQL queries and stored procedures to identify items that could be converted to Informatica Cloud ISD.
- Involved in gathering and documenting business requirements from end clients and translating them into report specifications for the Tableau platform.
- Created Informatica mappings with PL/SQL stored procedures/functions to incorporate critical business functionality into data loads.
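A minimal sketch of the kind of pmcmd wrapper script referenced above; the Integration Service, domain, folder, and workflow names are illustrative placeholders, not actual project values.

```sh
#!/bin/ksh
# Minimal sketch (assumed names) of a wrapper that starts an Informatica
# workflow via pmcmd and checks the return code before exiting.
INFA_SVC="IS_DW_PROD"          # hypothetical Integration Service name
INFA_DOMAIN="DOM_DW"           # hypothetical domain name
INFA_FOLDER="NETEZZA_LOADS"    # hypothetical repository folder
WORKFLOW="wf_stg_to_dw_load"   # hypothetical workflow name

# Credentials are read from the INFA_USER / INFA_PASSWD environment variables.
pmcmd startworkflow -sv "$INFA_SVC" -d "$INFA_DOMAIN" \
     -uv INFA_USER -pv INFA_PASSWD \
     -f "$INFA_FOLDER" -wait "$WORKFLOW"
rc=$?

if [ $rc -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with return code $rc" >&2
    exit $rc
fi
echo "Workflow $WORKFLOW completed successfully"
```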
Environment: Informatica PowerCenter 10.2.0, PowerExchange, Oracle 12c, Netezza, Netezza Aginity, Teradata, SQL & PL/SQL, SQL Server, Korn Shell Scripting, XML, T-SQL, TOAD, UNIX, Linux, Windows, Excel, Tableau, Flat Files, Tivoli.
Confidential, Columbus, OH
ETL /Data Engineer
Responsibilities:
- Participate in design and analysis sessions with business analysts, source-system technical teams, and end users.
- Responsible for developing and maintaining ETL jobs, including ETL implementation and enhancements, testing and quality assurance, troubleshooting issues, and ETL/query performance tuning.
- Designing, developing, maintaining, and supporting Data Warehouse and OLTP processes via Extract, Transform, and Load (ETL) software using Informatica.
- Manage and expand the current ETL framework for enhanced functionality and expanded sourcing.
- Translate business requirements into ETL and report specifications. Performed error handling using session logs.
- Worked on PowerCenter tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Involved in creating database objects like tables, views, procedures, triggers, and functions using T-SQL to provide definition, structure and to maintain data efficiently.
- Performed code reviews of ETL and SQL processes. Worked on upgrading Informatica from version 9.x to 10.x.
- Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, and Sequence Generator transformations.
- Identifying performance issues in the code and tuning it to complete the jobs within the given SLA time.
- Performance tuning at the Informatica and database levels, depending on the bottlenecks identified in the jobs.
- Developed mappings using Informatica PowerCenter transformations - Lookup (Unconnected, Connected), Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, and Sequence Generator.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections, and relational connections (a generation sketch follows this list).
- Worked with the Informatica Data Quality 9.x (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Performed relational data modeling, ER diagramming (forward & reverse engineering), dimensional modeling, OLAP multidimensional cube design & analysis, slowly changing dimension definition, and surrogate key management.
- Worked with the testing team to resolve bugs related to day-one ETL mappings before production.
- Maintained the ETL release document for every release and migrated the code into higher environments through deployment groups.
- Created change request before code deployment into production to get required approvals from business.
- Created weekly project status reports, tracking the progress of tasks according to schedule and reported any risks and contingency plan to management and business users.
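A minimal sketch, under assumed folder, workflow, session, and parameter names, of how a parameter file like those described above can be generated from a shell script before a workflow run.

```sh
#!/bin/ksh
# Minimal sketch (assumed names/paths) of dynamic parameter file generation
# before an Informatica workflow run.
PARAM_FILE=/infa/param/wf_claims_load.parm   # hypothetical parameter file path
LOAD_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[DW_CLAIMS.WF:wf_claims_load.ST:s_m_stg_claims]
\$\$LOAD_DATE=$LOAD_DATE
\$\$SOURCE_SYSTEM=POLICY_ADMIN
\$DBConnection_SRC=ORA_CLAIMS_SRC
\$DBConnection_TGT=ORA_DW_TGT
EOF

echo "Generated parameter file $PARAM_FILE for load date $LOAD_DATE"
```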
Environment: Informatica PowerCenter 10.x/9.x, IDQ, OLAP, Oracle, MS SQL Server, PL/SQL, SSIS, SSRS, SSAS, SQL*Plus, SQL*Loader, Windows, UNIX.
Confidential, Minneapolis, MN
ETL/Informatica Developer
Responsibilities:
- Developed Informatica ETL job flows to extract data from the sources and load it into the data mart.
- Developed mappings using the needed transformations in the Informatica tool according to technical specifications.
- Implemented Informatica error handling, dynamic parameter file generation, and coding best practices.
- Involved in performance tuning of the Informatica mappings, stored procedures, and the SQL queries inside the Source Qualifiers.
- Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
- Worked on requirements gathering, architecting the ETL lifecycle, and creating design specifications and ETL design documents.
- Responsible for unit and integration testing of Informatica sessions, batches, and the target data.
- Scheduled the workflows to pull data from the source databases at weekly intervals to maintain the most current and consolidated data.
- Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.0.
- Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
- Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, XML, SQL server and Flat files and loaded into Oracle.
- Worked on the design and development of Informatica mappings and workflows to load data into the staging area, data warehouse, and data marts in Oracle.
- Developed PL/SQL triggers and master tables for automatic creation of primary keys.
- Created PL/SQL stored procedures, scripts, functions, and packages to extract data from the operational database into simple flat text files using the UTL_FILE package.
- Designed the source-to-target mappings and was involved in designing the selection criteria document.
- Wrote BTEQ scripts to transform data and used the Teradata utilities FastLoad, MultiLoad, and TPump to load data (a BTEQ wrapper sketch follows this list).
- Responsible for manually starting and monitoring production jobs based on business users' requests.
- Analyzed the business requirements and created ETL logic to extract data from flat files coming from manufacturing sites in different geographic regions and load the data into the data warehouse.
- Prepared ETL Specifications and design documents to help develop mappings.
- Created Mappings for Historical and Incremental loads.
- Worked on staging the data into work tables, cleansing it, and loading it further downstream into dimensions (using Type 1 and Type 2 logic) and fact tables, which constitute the data warehouse.
- Worked with pmcmd to interact with the Informatica server from command mode and to execute the shell scripts.
- Designed and developed Informatica mappings for data sharing between interfaces utilizing SCD Type 1, Type 2, and Type 3 methodologies.
- Developed Informatica mappings and workflows for new and existing requirements.
- Worked on Teradata queries, macros, utilities like MLOAD and FLOAD. Tuned the queries and procedures.
- Involved in developing UNIX shell scripts to generate parameter files and executing Oracle procedures as batch jobs.
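A minimal sketch of a BTEQ transformation step driven from a shell wrapper, as referenced above; the TDPID, logon, and table names are illustrative placeholders rather than actual project objects.

```sh
#!/bin/ksh
# Minimal sketch (assumed TDPID, logon, and table names) of a BTEQ
# insert-select transformation step invoked from a shell wrapper.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWD};

INSERT INTO dw_db.sales_fact (sale_id, sale_dt, amount)
SELECT stg.sale_id, CAST(stg.sale_dt AS DATE), stg.amount
FROM   stg_db.sales_stg stg
WHERE  stg.load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
rc=$?
if [ $rc -ne 0 ]; then
    echo "BTEQ step failed with return code $rc" >&2
fi
exit $rc
```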
Environment: Informatica PowerCenter 10.0/9.6, PowerExchange 9.6, Informatica Data Quality 9.6.1, Oracle 11g, Erwin, VSAM, DB2 mainframe, Flat files, T-SQL, Teradata 13, QlikView 11, Control-M, SQL Server, SQL*Plus, SQL*Loader, TOAD for Oracle, Tableau, UNIX shell scripts.
Confidential, FL
ETL Informatica Developer
Responsibilities:
- Worked closely with a team of business users to gather business requirements and translate them into technical specifications.
- Created profiles, rules, and mappings with the Informatica Developer and Analyst tools.
- Strong in implementing data profiling, creating scorecards, creating reference tables, and documenting data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency.
- Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
- Performed data profiling and analysis using Informatica Data Quality (IDQ).
- Worked on different transformations such as Case Converter, Standardizer, and Labeler for cleansing and standardizing the data.
- Performed Informatica admin tasks such as migrating Informatica code between different environments, folder copies, and creating Informatica repository queries to query metadata information.
- Created ETL mappings to send Medicaid data to MMIS and generated error reports from MMIS.
- Created a Web Services Consumer transformation to invoke web services for address scrubbing.
- Used IDQ to perform data profiling and data cleansing and to create reference tables, scorecards, etc.
- Wrote and ran different types of SQL queries, including create/insert/update statements, on the given databases.
- Wrote complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the data volume was heavy.
- Performance tuning of Informatica mappings for large data files by managing block sizes, data cache sizes, sequence buffer lengths, and commit intervals.
- Coded UNIX shell scripts to act as file watchers and to move flat files between various locations after Informatica sessions (a file-watcher sketch follows this list).
- Developed mappings based on Change Data Capture (CDC) techniques.
- Used TOAD, IBM Data Studio, DBeaver, etc. to analyze the existing data and to design SQL queries for mappings.
- Involved in unit, system integration, and user acceptance testing of mappings.
- Supported the process steps under development, test and production environment.
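A minimal file-watcher sketch of the kind referenced above: poll for an inbound flat file and move it to the PowerCenter source file directory; the paths, file name, and polling window are assumed placeholders.

```sh
#!/bin/ksh
# Minimal file-watcher sketch (assumed paths and timings): poll for an inbound
# flat file and move it to the Informatica source file directory when it lands.
INBOUND=/data/inbound/members_extract.dat   # hypothetical landing file
SRCDIR=/infa/SrcFiles                       # hypothetical PowerCenter source dir
MAX_TRIES=60                                # polling attempts
SLEEP_SECS=300                              # seconds between attempts

i=0
while [ $i -lt $MAX_TRIES ]; do
    if [ -s "$INBOUND" ]; then
        mv "$INBOUND" "$SRCDIR"/ && exit 0
    fi
    sleep $SLEEP_SECS
    i=$((i + 1))
done

echo "File $INBOUND did not arrive within the expected window" >&2
exit 1
```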
Environment: Informatica Data Quality 9.6.1, Informatica PowerCenter 9.6.1/9.5.1, SQL, Oracle 11g/10g, Autosys, SQL Server, SQL*Loader, SQL Developer, PL/SQL Developer, TOAD, PuTTY, PL/SQL, UNIX Shell Scripting, Flat Files, Control-M.
Confidential, Hartford, CT
Informatica Developer
Responsibilities:
- Involved in business analysis and technical design sessions with business and technical staff to develop the requirements document and ETL specifications.
- Involved in dimensional modeling and data modeling using the Erwin tool.
- Created high-level Technical Design Document and Unit Test Plans.
- Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
- Wrote complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the data volume was heavy.
- Responsible for creating workflows. Created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
- Prepared user requirement documentation for mapping and additional functionality.
- Extensively used PowerCenter to load data from source systems such as flat files into staging tables and then into the target Oracle database. Analyzed the existing systems and performed a feasibility study.
- Analyzed the current system and programs and prepared gap analysis documents.
- Experience in performance tuning and optimization of SQL statements using SQL Trace.
- Involved in Unit, System integration, User Acceptance Testing of Mapping.
- Supported the process steps under development, test and production environment
Environment: Informatica PowerCenter 8.1.4/7.1.4, Oracle 10g/9i, TOAD, Business Objects 6.5/XI R2, UNIX, ClearCase