Sr. ETL/Informatica Developer Resume
Wilmington, DE
SUMMARY
- 7+ years of experience in Information Technology, including Data Warehouse/Data Mart development using ETL/Informatica PowerCenter and Data Modeler/Data Analyst work across industries such as Healthcare, Insurance, and Banking.
- Extensive experience using Informatica PowerCenter 9.x/8.x/7.x to carry out data extraction, transformation, and loading, as well as administration tasks such as creating domains, repositories, and folders.
- Experience in all phases of the Data Warehouse life cycle, involving Requirement Analysis, Design, Coding, Testing, and Deployment.
- Extensively worked on Informatica IDQ for data profiling, data enrichment and standardization.
- Extensive hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPT) and UNIX in developing tools and system applications.
- Extensive experience in Enterprise Data Warehousing, Data Integration, and Data/Code Migration.
- Expertise in extraction, transformation, and loading of data across heterogeneous sources and targets.
- Expertise in creating detailed design documents and performing Proofs of Concept (POC).
- Good understanding of the underlying concepts and hands-on experience with Repository Manager, Designer, and the Informatica Admin Console.
- Worked on standard Python packages such as boto and boto3 for AWS (see the boto3 sketch after this list).
- Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architecture to load data from various sources such as DB2 UDB, Oracle, flat files, XML files, Teradata, Sybase, and MS SQL Server into Oracle, Teradata, XML, and SQL Server targets.
- Extracted data from multiple operational sources and loaded the staging area, data warehouse, and data marts using CDC and SCD (Type 1/Type 2/Type 3) loads (see the SCD Type 2 sketch after this list).
- Extensively created mapplets, common functions, reusable transformations, and lookups for better reusability.
- Good understanding of relational database management systems such as Oracle, Teradata, DB2, and SQL Server; worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
- Strong knowledge of the Entity-Relationship concept, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema).
- Expert in Oracle 11g/10g, IBM DB2 8.1, Sybase, SQL Server 2008/2005, SQL, and PL/SQL (stored procedures, functions, and exception handling) using TOAD.
- Worked on performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.
- Highly proficient in data modeling, with a strong grasp of RDBMS concepts, logical and physical data modeling using Third Normal Form (3NF), and multidimensional data modeling schemas for various environments and business processes.
- Developed, deployed, and monitored SSIS packages for new ETL processes and upgraded the existing DTS packages to SSIS for the ongoing ETL processes.
- Exposure to developing ETL packages using SSIS.
- Expertise in verification and validation tools for Python code, and in developing conceptual and logical models and physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems using ERwin and PowerDesigner.
- Created GoldenGate Replicats to replicate data from transactional systems onto the Data Warehouse staging area.
- Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
- Experience working in agile methodology and ability to manage change effectively.
- Responsible for Team Delivery and Participated in Design Reviews.
- Very good hands-on experience in MDM development.
- Experience in Cognos Report Studio to design Reports based on the requirements by the end user.
- Excellent communication and interpersonal skills, strong analytical reasoning, and the ability to quickly assimilate new technologies, tools, and concepts.
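As referenced above, a minimal boto3 sketch of the kind of AWS scripting described in the Python bullet. The bucket and key names are hypothetical placeholders, not values from any actual project.

```python
# Minimal boto3 sketch: upload an ETL extract file to S3 and confirm the load.
# Bucket and key names below are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

# Upload a flat-file extract produced by an ETL session.
s3.upload_file("daily_extract.csv", "example-etl-bucket", "staging/daily_extract.csv")

# List objects under the staging prefix to confirm the object landed.
response = s3.list_objects_v2(Bucket="example-etl-bucket", Prefix="staging/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```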
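And a hedged sketch of the SCD Type 2 pattern mentioned in the loads bullet: expire the current dimension row, then insert the new version. Table and column names are hypothetical; the connection is any DB-API 2.0 connection whose paramstyle supports named binds (e.g., cx_Oracle / python-oracledb).

```python
# SCD Type 2 sketch: close out the existing row, insert the changed version.
# dim_customer and its columns are hypothetical illustration names.

EXPIRE_SQL = """
UPDATE dim_customer
   SET current_flag = 'N',
       effective_end_date = :load_date
 WHERE customer_id = :customer_id
   AND current_flag = 'Y'
"""

INSERT_SQL = """
INSERT INTO dim_customer
    (customer_id, customer_name, address,
     effective_start_date, effective_end_date, current_flag)
VALUES
    (:customer_id, :customer_name, :address, :load_date, NULL, 'Y')
"""

def apply_scd2(conn, row, load_date):
    """Expire the current dimension row for this key and insert the new version."""
    cur = conn.cursor()
    cur.execute(EXPIRE_SQL, {"customer_id": row["customer_id"],
                             "load_date": load_date})
    cur.execute(INSERT_SQL, {"customer_id": row["customer_id"],
                             "customer_name": row["customer_name"],
                             "address": row["address"],
                             "load_date": load_date})
    conn.commit()
```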
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 9.6/9.5/8.6.1/8.1/7.1.2, IDQ, MDM, SQL Server SSIS.
Databases: Teradata V2R12, Oracle 11g/10g/9i/8i, MS SQL Server 2012/2008/2005, DB2 UDB, MS Access 2000, Sybase
Modeling Tools: Erwin 9.x, Rational Rose, ER/Studio, MS Visio, PowerDesigner.
Others: Toad, SQL Navigator, Cognos Report Studio, Teradata SQL Assistant.
Project Execution Methodologies: Ralph Kimball methodology, Bill Inmon methodology, Star and Snowflake schemas, fact tables, dimension tables, Rapid Application Development (RAD), Joint Application Development (JAD).
Programming Languages: SQL, PL/SQL, Perl, UNIX shell scripting.
Job Scheduling: Autosys, Shell Scripting
PROFESSIONAL EXPERIENCE
Confidential, Wilmington, DE
Sr. ETL/Informatica Developer
Responsibilities:
- Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
- Responsible for Data Warehouse Architecture, ETL and coding standards.
- Developed capacity planning, architecture, and strategic roadmaps, and implemented standards.
- Responsible for creating the SSIS packages to implement the logic per the business requirements.
- Used several SSIS components such as SCD, Lookup, and Merge Join transformations, and tasks such as Execute SQL, Data Flow, and File System tasks, to implement the logic for the business requirements.
- Used Informatica as the ETL tool, along with stored procedures, to pull data from source systems and files, then cleanse, transform, and load the data into databases.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
- Created dictionaries using Informatica Data Quality (IDQ) that were used to cleanse and standardize data. Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
- Extracted data from SAP R/3 and loaded into Oracle Data Warehouse.
- Used transformations such as Lookup, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator, and Update Strategy extensively.
- Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
- Created Mapplets and used them in different Mappings.
- Expert-level proficiency with the ETL tools used at Overstock, such as GoldenGate, Oracle Data Integrator, DMExpress, and Control-M.
- Created parameter files and Replicats in the GoldenGate replication tool to source real-time transactional data for operational reporting.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Involved in match/merge and match rules to check the effectiveness of the MDM process on data.
- Developed a stored procedure to check source data against warehouse data; records not present in the warehouse are written to a spool table, which is then used as a lookup in a transformation (see the sketch after this list).
- Performed extensive bulk loading into the target using Oracle SQL*Loader.
- Designed and developed mappings for loading the MDM Hub.
- Ensured MDM code conforms to established coding standards and meets the feature specification.
- Performed application tuning and disk I/O tuning to enhance system performance.
- Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
- Documented Cleansing Rules discovered from data cleansing and profiling.
- Created UNIX shell scripts for the Informatica ETL tool to automate sessions and cleanse source data (see the pmcmd automation sketch after this list).
- Debugged and performance-tuned targets, sources, mappings, and sessions.
- Optimized mappings and implemented complex business rules by creating reusable transformations and mapplets.
- Delivered all the projects/assignments within specified timelines.
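A hedged sketch of the spool-table pattern from the stored-procedure bullet above: source rows missing from the warehouse are written to a spool table that a downstream lookup reads. All table and column names are hypothetical; the connection is assumed to be a cx_Oracle / python-oracledb connection.

```python
# Spool-table sketch: insert source records absent from the warehouse into a
# spool table for use as a lookup. stg_source, dw_target, and
# etl_spool_missing are hypothetical illustration names.

SPOOL_MISSING_SQL = """
INSERT INTO etl_spool_missing (natural_key, source_system, spooled_at)
SELECT s.natural_key, s.source_system, SYSDATE
  FROM stg_source s
 WHERE NOT EXISTS (
           SELECT 1
             FROM dw_target t
            WHERE t.natural_key = s.natural_key)
"""

def spool_missing_records(conn):
    """Spool source records that are not yet present in the warehouse."""
    cur = conn.cursor()
    cur.execute(SPOOL_MISSING_SQL)
    conn.commit()
    return cur.rowcount  # number of records spooled for the lookup
```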
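And a sketch of shell-style session automation around Informatica's pmcmd command line, as described in the scripting bullet. The service, domain, folder, workflow, and credential values are hypothetical placeholders; real credentials would come from a vault or environment, not the script.

```python
# Automating a PowerCenter workflow run via the pmcmd CLI, assuming pmcmd is
# on PATH. All names below are placeholders.
import subprocess

def start_workflow(folder, workflow):
    """Kick off a PowerCenter workflow and wait for it to finish."""
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "IntSvc_DEV",    # integration service (placeholder)
        "-d", "Domain_DEV",     # domain (placeholder)
        "-u", "etl_user",       # user (placeholder)
        "-p", "********",       # password (placeholder)
        "-f", folder,
        "-wait", workflow,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"Workflow {workflow} failed:\n{result.stderr}")
    return result.stdout

# Example: start_workflow("FOLDER_SALES", "wf_load_sales_daily")
```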
Environment: Informatica PowerCenter 9.6, PowerExchange, Oracle 11g, SAP R/3, flat files, MS SQL Server 2008, DB2 8.0, SSIS, Erwin, WinSCP, Control-M, MS Visio, MDM, Mercury Quality Center, GoldenGate, Shell Script, UNIX.
Confidential, Sacramento, CA
ETL Developer
Responsibilities:
- Conducted JAD sessions with business users and SMEs for a better understanding of the reporting requirements.
- Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
- Developed high-level and detailed technical and functional documents, including detailed design documentation, functional test specifications with use cases, and unit test documents.
- Analyzed source systems and worked with business analysts to identify, study, and understand requirements and translate them into ETL code.
- Handled technical and functional calls across teams.
- Responsible for the Extraction, Transformation and Loading (ETL) Architecture & Standards implementation.
- Followed AWS best practices to convert data types from Oracle to Redshift.
- Created database objects in AWS Redshift (see the DDL sketch after this list).
- Experience developing and maintaining applications written for Amazon Simple Storage Service (S3), AWS Elastic MapReduce, and AWS CloudFormation.
- Executed multiple trial data conversions and go-live data loads prior to the successful migration of production data from legacy data sources to the target application.
- Worked extensively on importing metadata into Hive using Python and migrated existing tables and applications to work on the AWS cloud (S3).
- Analyzed and created high-level and low-level technical designs for data migration through Business Objects Data Services.
- Involved in the designing of Landing, Staging and Base tables in Informatica MDM.
- Used the Entity 360 view to examine a wide range of data related to an entity.
- Responsible for the offshore code delivery and review process.
- Used Informatica to extract data from DB2, HL7, XML, and flat files and load it into Teradata.
- Worked in all phases of data integration, from heterogeneous sources and legacy systems to the target database.
- Worked on Informatica Power Center tool - Source Analyzer, Warehouse designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, Informatica Workflow Manager and Workflow Monitor.
- Worked on test version of Informatica Data Quality (IDQ) to verify the accuracy of data on Addresses and Contacts.
- Involved in design, code, and test reviews, providing valuable suggestions.
- Worked with different caches such as index cache, data cache, lookup cache (static, dynamic, and persistent), and join cache while developing the mappings.
- Worked on Cognos Report studio to generate Complex Reports based on the requirements.
- Developed complex reports using Cognos Report Studio, such as summary reports, detail reports, drill-through reports, and drill-down reports.
- Used Cognos Connection to administer and schedule reports to run at various intervals.
- Created partitions for parallel processing of data and worked with DBAs to enhance data loads in production.
- Performance-tuned Informatica sessions for large data files by increasing block size, data cache size, and the target-based commit interval.
- Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2.
- Worked with data extraction, transformation, and loading using the Teradata BTEQ, FastLoad, and MultiLoad utilities to load data into tables (see the FastLoad sketch after this list).
- Performed data integrity checks, validation, and testing on the data migrated into the data warehouse.
- Created MDM mappings and configured match and merge rules to integrate the data received from different sources.
- Took part in the migration of jobs from UIT to SIT and on to UAT.
- Created FTP scripts and conversion scripts to convert data into flat files for use in Informatica sessions.
- Involved in Informatica Code Migration across various Environments.
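A hedged sketch of the Redshift object creation and Oracle-to-Redshift type conversion described above. The cluster endpoint, credentials, and table definition are hypothetical; the Oracle-source types noted in the comments are illustrative mappings.

```python
# Creating a Redshift staging table via psycopg2, with Oracle types converted
# to Redshift equivalents. All connection details and names are placeholders.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS stg_orders (
    order_id     BIGINT,          -- from Oracle NUMBER(12)
    customer_nm  VARCHAR(100),    -- from Oracle VARCHAR2(100)
    order_amt    NUMERIC(12, 2),  -- from Oracle NUMBER(12,2)
    order_dt     TIMESTAMP        -- Oracle DATE carries a time component
)
DISTKEY (order_id)
SORTKEY (order_dt);
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dw", user="etl_user", password="********",
)
with conn, conn.cursor() as cur:
    cur.execute(DDL)  # the with-block commits on successful exit
```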
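And a sketch of a utility-based load as in the FastLoad/MultiLoad bullet: generate a FastLoad script and run the client via a subprocess. The TDPID, credentials, table, and file names are hypothetical, and the script layout is one common FastLoad form, assuming the fastload client is on PATH.

```python
# Driving the Teradata fastload utility from Python; the script is fed on
# stdin, as with "fastload < script". All names below are placeholders.
import subprocess

FASTLOAD_SCRIPT = """
LOGON tdprod/etl_user,********;
SET RECORD VARTEXT "|";
DEFINE order_id (VARCHAR(12)),
       customer_nm (VARCHAR(100)),
       FILE = orders.txt;
BEGIN LOADING stg.orders
      ERRORFILES stg.orders_err1, stg.orders_err2;
INSERT INTO stg.orders (order_id, customer_nm)
VALUES (:order_id, :customer_nm);
END LOADING;
LOGOFF;
"""

result = subprocess.run(["fastload"], input=FASTLOAD_SCRIPT,
                        capture_output=True, text=True)
print(result.stdout)
```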
Environment: Informatica PowerCenter 9.5, Oracle 11g, Teradata, FastLoad, Cognos Report Studio, MultiLoad, Teradata SQL Assistant, MS SQL Server 2012, TOAD, Erwin, AIX, Shell Scripts, Autosys, Informatica Data Quality (IDQ), UNIX.
Confidential, Raleigh, NC
Informatica Developer/Administrator
Responsibilities:
- Prepared the required application design documents based on functionality required
- Designed the ETL processes using Informatica to load data from Oracle, DB2 and Flat Files to staging database and from staging to the target database.
- Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
- Involved in migration of mappings and sessions from development repository to production repository
- Extensively used Informatica and created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
- Involved in cleansing and extraction of data and defined quality process for the warehouse.
- Involved in performance tuning and optimization of Informatica mappings and sessions, using features such as partitions and index cache to manage very large volumes of data.
- Designed and configured Informatica MDM across master data sets.
- Created stored procedures to transform the data and worked extensively in PL/SQL for the various transformation needs while loading the data (see the sketch after this list).
- Translated Business specifications into PL/SQL code. Extensively developed and fine-tuned Oracle Stored Procedures and triggers.
- Used Update Strategies for cleansing, updating and adding data to the existing processes in the warehouse.
- Logged defects and submitted change requests using the defects module of Test Director/HP Quality Center.
- Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
- Involved in unit testing and User Acceptance Testing to check whether the data extracted from the different source systems loaded into the target according to the user requirements.
- Involved in migrating objects from DEV to QA and testing them and then promoting to Production
- Involved in production support, working on tickets created as users retrieved data from the database.
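A hedged sketch of invoking one of the PL/SQL transformation procedures described above from Python via cx_Oracle. The package, procedure, parameters, and connection details are hypothetical illustrations, not the actual project code.

```python
# Calling a PL/SQL transformation procedure from Python. pkg_load and its
# procedure signature are placeholders for illustration only.
import cx_Oracle

conn = cx_Oracle.connect("etl_user", "********", "dbhost/ORCL")
cur = conn.cursor()

# OUT bind variable to capture how many staged rows the procedure transformed.
rows_processed = cur.var(int)
cur.callproc("pkg_load.transform_staged_batch", ["2023-01-31", rows_processed])
conn.commit()
print("rows processed:", rows_processed.getvalue())
```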
Environment: Informatica PowerCenter 8.6.1, Business Objects, Oracle 10g, TOAD, Erwin, SQL, PL/SQL, XML, HP-UX, Tableau 9.3/10.1, Test Director/Quality Center
Confidential
Data Modeler/Data Analyst
Responsibilities:
- Worked as a Data Modeler/Analyst to generate Data Models using Erwin and subsequent deployment to Enterprise Data Warehouse.
- Developed logical data models and physical database design and generated database schemas using Erwin.
- Worked with the DBA to create a best-fit physical data model from the logical data model using forward engineering in Erwin.
- Performed data analysis of existing databases using SQL to understand the data flow and the business rules applied across the different databases.
- Developed, managed, and validated existing data models, including logical and physical models of the data warehouse and source systems, utilizing a 3NF model.
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle, SQL Server, and DB2 (see the profiling sketch after this list).
- Defined an ETL framework that includes load patterns for the staging and ODS layers using the ETL tool, a file archival process, a data purging process, and a batch execution process.
- Designed Mapping Documents and Mapping Templates for the ETL team.
- Involved in data profiling and data cleansing to eliminate Data Redundancy and enhance performance of the database.
- Involved in Data Warehouse support, using Star Schema and dimensional modeling to help design the data marts and the data warehouse.
- Involved in using "Complete-Compare" functionality to support iterative development by keeping the models and databases in synch.
- Coordinated multiple small and large data integration, data warehousing, and Oracle development projects.
- Performed Data Mapping between source systems to Target systems, logical data modeling, created class diagrams and ER diagrams and used SQL queries to filter data.
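A hedged sketch of the column-level profiling described in the data-profiling bullet: row counts, null counts, and distinct counts per column. The table and column names are hypothetical, and the connection is any DB-API 2.0 connection to the source system.

```python
# Column-level profiling sketch: counts, null rate inputs, and cardinality.
# The {table}/{col} names are assumed to come from a vetted metadata list,
# not user input, since they are interpolated into the SQL text.

PROFILE_SQL = """
SELECT COUNT(*)                                        AS row_cnt,
       SUM(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END)  AS null_cnt,
       COUNT(DISTINCT {col})                           AS distinct_cnt
  FROM {table}
"""

def profile_column(conn, table, col):
    """Return (row_count, null_count, distinct_count) for one column."""
    cur = conn.cursor()
    cur.execute(PROFILE_SQL.format(table=table, col=col))
    return cur.fetchone()

# Example: profile_column(conn, "customer_src", "email_addr")
```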
Environment: Oracle 12c/11g, Windows 7, PL/SQL, DB2, MS-Access, SQL SERVER, MS Office, MS Visio, Erwin 4.0/7.0, Teradata 14.10
Confidential
Data Analyst
Responsibilities:
- Analyzed data sources, requirements, and business rules to perform logical and physical data modeling.
- Played an active role as a member of Project team to provide business data requirements analysis services producing conceptual and logical data models.
- Designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in ERWIN.
- Worked with the DBA to convert logical Data models to physical Data models for implementation.
- Extensively used normalization techniques (up to 3NF).
- Created database objects like Tables, Stored Procedures, Indexes, Sequences, Views, Rules etc.
- Developed SQL queries to fetch complex data from different tables in remote databases using joins, database links, and bulk collects.
- Participated in JAD sessions to resolve critical issues.
- Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
Environment: Erwin 4.0, PL/SQL, MS Visio, MS Excel, Oracle 9i.