Sr ETL/MDM Developer Resume
- Over 10 years of progressive Information Technology experience with extensive Business Intelligence domain experience.
- Proficient in ETL (Informatica 8.x/9.x/10.2, Informatica IDQ, Informatica MDM), reporting (SAP BO, OBIEE), databases (Teradata, Oracle, SQL Server), and large Data Warehouse applications.
- Strong knowledge of all phases of the Software Development Life Cycle (SDLC), including requirement gathering, data analysis, design, development, testing, deployment, and production support of complex Data Warehousing & BI solutions.
- Experienced in reports, testing, migration, administration, security management, and production support.
- Highly proficient in extracting, transforming, and loading data into target systems using Informatica PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Extensive experience in extraction, transformation, and loading of data directly from heterogeneous source systems such as flat files, XML, and backend databases including Oracle, SQL Server, and Teradata.
- Developed mappings using Informatica transformations (e.g., Normalizer, Aggregator, Java Transformation, Expression, Lookup, Joiner, Filter, Router, Sorter).
- Configured Informatica ETL connectivity across different databases and flat files.
- Experience with Informatica PowerExchange (9.x) for loading/retrieving data from mainframe systems.
- Good experience in Data Modeling with expertise in creating Star & Snowflake schemas, Fact and Dimension tables, and Physical and Logical data models using Erwin and Visio.
- Experience with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.x.
- Designed, Installed, Configured core Informatica/Siperian MDM Hub components such as Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, Data Modeling.
- Experience in designing E-R diagrams, Logical and Physical database designs using Erwin and Visio.
- Skilled in Business Objects reporting functionalities such as Master/Detail, Slice and Dice, drilling methodology, Filters, Ranking, Sections, Graphs, and Breaks; able to create Contexts, Aliases, and derived tables where necessary using BO.
- Developed high-performance semantic/structural layers for reporting and analytics.
- Integrated data sources from multiple channels and prepared layouts for reports and dashboards.
- Experience in writing functional & technical documents
- Knowledge of Teradata architecture; worked with utilities (FastLoad, FastExport, BTEQ, etc.).
- Experienced in writing UNIX shell wrapper scripts to call ETL jobs.
- Excellent communication and interpersonal skills working within diverse teams; strong analytical skills and an eagerness to learn new technologies.
- Managed and led teams toward targets as and when assigned.
- Tracked the team's resource allocation to a project and handled resource management across other projects.
- Preparing shift schedules of the team to ensure capacity availability.
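The UNIX shell wrapper scripts mentioned above might be sketched roughly as below; the integration service, domain, and credential variable names (INT_SVC, DOMAIN, PM_USER, PM_PASS) are placeholders, not details from any actual environment:

```shell
#!/bin/sh
# Minimal sketch of a wrapper around Informatica's pmcmd CLI.
# INT_SVC, DOMAIN, PM_USER and PM_PASS are placeholder names.
run_workflow() {
    folder="$1"
    workflow="$2"
    logfile="/tmp/${workflow}.log"

    # -wait blocks until the workflow finishes; pmcmd exits non-zero on failure.
    pmcmd startworkflow -sv INT_SVC -d DOMAIN \
        -uv PM_USER -pv PM_PASS \
        -f "$folder" -wait "$workflow" > "$logfile" 2>&1
    rc=$?

    if [ "$rc" -ne 0 ]; then
        echo "ERROR: workflow $workflow failed (rc=$rc); see $logfile" >&2
    fi
    return "$rc"
}
```

A scheduler entry (e.g. Autosys or Control-M) would then call `run_workflow <folder> <workflow>` and act on the return code.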
Operating System: Windows XP/2000/98, Vista, UNIX, Red Hat LINUX
Database: Oracle 9i/10g/11g, MS SQL Server 2008/2012, Teradata 13.10/14.0
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling
Tools: Informatica 10.2/9.x/8.x, Informatica Cloud, IDQ, MDM, PWX, SAP Business Objects 3.1/4.0, OBIEE 11g, IBM Tivoli, Control-M, SVN Repo Browser, Autosys
Sr ETL/MDM Developer
- Analyzed and documented business requirements.
- Created high level design, technical design documents and mapping documents based on business requirements to consolidate bankruptcy information.
- Parsed and staged data received as MISMO XML files from WKFS, which is used for HMDA reporting.
- Performed extensive data analysis using Data Quality to design interfaces as required.
- Designed and developed various reports that are sent to NBS (National Bankruptcy Services) on a daily basis.
- Involved in implementing the Land process of loading the Borrower/Loan data set, coming from various source systems, into the Informatica MDM Hub using an ETL tool.
- Defined the Base objects, Staging tables, foreign key relationships, lookups, queries, packages and query groups.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Defined the Trust and Validation rules and set up the match/merge rule sets to get the right master records.
- Configured match rule set property by enabling search by rules in MDM according to Business Rules.
- Designed complex mappings to stage data and create extracts from complex normalized landing zone tables as per requirement.
- Designed applications and interfaces to update Writeback transactions to RMS (Recovery Management System).
- Created Slowly Changing Dimension (SCD) Type 1 and Type 2, Slowly Growing Target, and Simple Pass-Through mappings to meet business requirements.
- Optimized the complex ETL process using Performance Tuning Methods.
- Wrote complex SQL queries per business requirements to fetch data from relational tables and assisted the BAs.
- Developed Autosys scripts to run the Informatica workflows and SFTP source/target files from various systems
Environment: Informatica 10.2/9.6.x, IDQ 9.6, MDM 9.6, PWX 9.x, SQL Server 2012, Autosys and Windows NT.
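One plausible shape for the Autosys-driven SFTP-plus-workflow jobs described in this role is sketched below; the host, remote path, landing directory, folder, and workflow names are all hypothetical:

```shell
#!/bin/sh
# Illustrative job body: fetch a daily source extract over SFTP, then start
# the load workflow via pmcmd. Host, paths and names are placeholders.
fetch_and_load() {
    remote_file="$1"
    workflow="$2"
    local_dir="/tmp/landing"

    mkdir -p "$local_dir"

    # Batch-mode sftp: feed a single 'get' command on stdin (-b -).
    printf 'get %s %s\n' "$remote_file" "$local_dir" \
        | sftp -b - "etluser@sftp.example.com"

    # Refuse to start the load if the file did not arrive or is empty.
    local_file="$local_dir/$(basename "$remote_file")"
    if [ ! -s "$local_file" ]; then
        echo "ERROR: $local_file missing or empty" >&2
        return 1
    fi

    pmcmd startworkflow -sv INT_SVC -d DOMAIN -f NBS -wait "$workflow"
}
```

An Autosys job definition would then point its command at a script calling, for example, `fetch_and_load /outbound/nbs_daily.dat wf_nbs_load`.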
Lead ETL Consultant
- Involved in gathering and analyzing the requirements and preparing business rules.
- Developed and maintained ETL (Extract, Transform and Load) mappings to extract data from multiple source systems such as Oracle, DB2, SQL Server, and flat files, loaded it into Oracle, and created compliance reports for AETNA and PRESS GANEY.
- Developed common stored procedures and shell scripts that are used across all applications to audit record counts.
- Involved in creating new table structures and modifying existing tables to fit into the existing Data Model.
- Extracted data from different databases like Oracle and prepared compliance extracts using ETL tool.
- Developed applications to process the feedback files in XML format and load them into the DWH.
- Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
- Created stored procedures and packages in PL/SQL for performing various tasks.
- Developed reports for billed, unbilled visits, payments using OBIEE.
- Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements
- Involved in Performance Tuning of mappings in Informatica.
Environment: Informatica 9.5.1, Oracle 11g, OBIEE, SVN Repo, Control-M and Unix.
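The record-count audit scripts mentioned in this role could look roughly like the sketch below; `run_count` is a hypothetical helper the caller provides (e.g. a sqlplus or bteq wrapper that prints the result of `SELECT COUNT(*)` for a table):

```shell
#!/bin/sh
# Sketch of a record-count audit between a source and a target table.
# run_count is assumed to be supplied by the caller: given a table name,
# it prints a single row count on stdout.
audit_counts() {
    src_table="$1"
    tgt_table="$2"

    src_cnt=$(run_count "$src_table")
    tgt_cnt=$(run_count "$tgt_table")

    if [ "$src_cnt" -ne "$tgt_cnt" ]; then
        echo "MISMATCH: $src_table=$src_cnt vs $tgt_table=$tgt_cnt" >&2
        return 1
    fi
    echo "OK: $src_table and $tgt_table both have $src_cnt rows"
}
```

A nightly audit would loop this over each source/target table pair and alert on any non-zero return code.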
- Participated in and reviewed Business & Functional requirements for all the BI projects, especially focusing on ETL processes, reporting, and dashboarding aspects.
- Partnered with Architecture team in understanding and providing inputs with respect to solution architecture.
- Created Detailed Design Documents based on the recommended architecture and business requirements.
- Created Source to Target mappings and various aspects of data access methods based on the capabilities of source and targets using transformations like Expression, Lookup, Update Strategy extensively.
- Created shell scripts that are used for preprocessing and postprocessing of flat files.
- Created Teradata scripts to load files that do not need any additional ETL processing into the database.
- Worked on scheduling workflows and sessions using the Informatica scheduler and implemented Decision and Email tasks to track successful completion.
- Identified source systems, connectivity, tables, and fields to ensure data suitability for mapping.
- Created flexible mappings/sessions using parameters and variables, making heavy use of parameter files.
- Analyzed session logs, bad files, and error tables to troubleshoot mappings and sessions; performed performance tuning of mappings where the goal was to insert thousands of rows into the target, identifying bottlenecks at the target, source, mapping, and session levels.
- Ensured the testing and development cycles of the reports were thoroughly completed as per the sign-off documents.
- Provided warranty support and production support for ETL applications post successful production migration.
Environment: Informatica 8.6.0/9.1, SAP BO, Oracle 10g, Teradata SQL Assistant 13/14, Unix and Tivoli
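The kind of Teradata load script described above, for feeds that need no further ETL, might be generated as in this sketch; the TDP id, credentials, column layout, and table names are illustrative only:

```shell
#!/bin/sh
# Generate a BTEQ script that imports a '|'-delimited file straight into a
# Teradata table. Logon string, columns and table names are placeholders.
make_bteq_script() {
    datafile="$1"
    table="$2"
    script="/tmp/load_$(basename "$table").bteq"

    cat > "$script" <<EOF
.LOGON tdprod/etl_user,etl_pass;
.IMPORT VARTEXT '|' FILE = $datafile;
.QUIET ON
.REPEAT *
USING (c1 VARCHAR(50), c2 VARCHAR(50))
INSERT INTO $table VALUES (:c1, :c2);
.LOGOFF;
.QUIT;
EOF
    echo "$script"
}

# Run the generated script only if the bteq utility is actually installed.
script=$(make_bteq_script /data/feeds/orders.dat stg.orders)
command -v bteq > /dev/null && bteq < "$script" \
    || echo "bteq not installed; script left at $script"
```

In practice the logon would come from a protected credentials file and the USING clause would mirror the real feed layout.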
- Involved in requirements gathering and analysis of source data coming in from different source systems.
- Verified record counts against DWH backend/reporting queries between source and target as an initial check.
- Verified data integrity between the various source tables and their relationships.
- Checked for missing data, negative values, and consistency; performed field-by-field data verification to confirm consistency between source and target data.
- Performed impact analysis and fine-tuned Informatica mappings to implement enhancements.
- Addressed production issues and code defects tracked in QTP.
- Coordinated with downstream consumers for end-to-end testing.
- Created extensive documentation for process flow and reuse of shared code.
Environment: Informatica 8.6.1, Oracle 9i, SQL Server 2008, Unix and IBM Tivoli
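The record-count and field-by-field verification described above can be sketched as a small shell check over two key-sorted, pipe-delimited extracts; the file layouts here are assumed, not taken from the project:

```shell
#!/bin/sh
# Row-by-row (and hence field-by-field) comparison of two extract files,
# e.g. a source dump vs. a target dump with the same delimited layout.
verify_extracts() {
    src="$1"
    tgt="$2"

    # Sort both sides so row ordering differences do not count as mismatches.
    sort "$src" > /tmp/src.sorted
    sort "$tgt" > /tmp/tgt.sorted

    if diff -q /tmp/src.sorted /tmp/tgt.sorted > /dev/null; then
        echo "MATCH: $src and $tgt are identical"
    else
        # Emit the differing rows for investigation.
        diff /tmp/src.sorted /tmp/tgt.sorted >&2
        return 1
    fi
}
```

For large tables this kind of check would typically run on keyed samples rather than full dumps.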