Data Engineer/ETL Developer Resume
Denver, CO
SUMMARY
- 7+ years of experience in Information Technology with a strong background in database development and data warehousing.
- Good experience in designing and implementing Data Warehousing and Business Intelligence solutions using ETL tools such as Informatica Power Center, Informatica Developer (IDQ), Informatica Power Exchange and Informatica Intelligent Cloud Services (IICS).
- Created mappings in Mapping Designer to load data from various sources using transformations such as Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure and Unstructured Data transformations.
- Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, SCD Type 2, SCD Type 3), Change Data Capture, dimensional data modeling, the Ralph Kimball approach, star/snowflake modeling, data marts, OLAP, fact and dimension tables, and physical and logical data modeling.
- Experience in developing transformations, mapplets and mappings in Informatica Designer and creating tasks using Workflow Manager to move data from multiple sources to targets.
- Experience working with the Informatica Data Quality tool, effectively using the Address Validator transformation for Address Doctor implementations.
- Worked in different phases of projects involving requirements gathering and analysis, design, development, testing, deployment and support.
- Good knowledge of the database architecture of OLTP and OLAP applications, and of data analysis.
- Worked with a wide variety of sources such as relational databases, flat files, XML files, mainframes, Salesforce and unstructured data files, and scheduling tools such as CA7, Control-M and Informatica Scheduler.
- Experience handling healthcare-standard unstructured files such as HL7, EDI 834, 835 and 837, and NCPDP files.
- Experience working with various Informatica concepts such as partitioning, performance tuning, identifying bottlenecks, deployment groups and Informatica Scheduler.
- Strong skills and a clear understanding of requirements and solutions to issues arising in implementation throughout the Software Development Life Cycle (SDLC).
- Hands-on experience in PL/SQL (stored procedures, functions, packages, triggers, cursors, indexes), UNIX shell scripting and Windows batch scripting.
- Expertise in unit testing, integration testing, system testing and data validation for developed Informatica mappings.
- Experience in production support, emergency fixes and migrating fixes from lower environments to higher environments per policy.
- Very good exposure to Oracle 11g/10g/9i, MS SQL Server 2012/2008/2005, Azure SQL Data Warehouse, Hive and Teradata v15/13.
- Excellent verbal and written communication skills; proven to be highly effective in interfacing across business and technical groups.
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 10.2/9.6.1/9.5.1/9.1/8.6.x, Informatica Power Exchange 9.5.1, Informatica Developer 9.6.1, Informatica Intelligent Cloud Services (IICS)
Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimensional Data Modeling
Databases: Oracle 11g/10g/9i, SQL Server 2012/2008/2005, Azure SQL Data Warehouse, Hive, Teradata V15/13
Scheduling Tools: CA7 Scheduler, TWS (Tivoli), Informatica Scheduler, BMC Control-M
Reporting Tools: Crystal Reports, MicroStrategy, Hyperion Essbase
Programming: SQL, PL/SQL, Transact-SQL, HTML, XML, C, C++, Shell, Perl, Batch
Operating Systems: Windows 7/XP/NT/2000, DOS, UNIX and Linux
Other Tools: SQL*Plus, Toad, SQL Developer, PuTTY, WinSCP, MS Office
PROFESSIONAL EXPERIENCE
Confidential, Denver, CO
Data Engineer/ETL Developer
Responsibilities:
- Worked with business analysts and the DBA on requirements gathering, business analysis and design of the data mart for an application portal, the primary source of the ETL feed in this project.
- Involved in creating logical and physical data models and in defining integrity constraints between tables.
- Worked on enhancing existing workflows to add historical functionality.
- Worked closely with BAs to draft source-to-target requirements per the business requirements, and created standard documents for design, review and development.
- Involved in complex development with agile timelines for business delivery.
- Created parameter files in Visual Studio.
- Created mappings involving full/incremental loads and Type 1 and Type 2 loads.
- Worked on an Informatica mapping to create an XML file and load it into the Eagle data warehouse.
- Developed interfaces that feed data to/from SAP FICO to track financial activities; development required loads with fixed-width and XML file formats.
- Developed interfaces to report data to/from the DWH, which involved development of constraint-based loading and dimension and fact loads.
- Effectively used the standard transformations as well as advanced transformations such as the Java, Address Validator, Transaction Control, SQL and dynamic Lookup transformations.
- Extracted/loaded data from/into diverse source/target systems such as SQL Server, XML and flat files.
- Created mapplets/worklets and used them in different mappings/workflows.
- Created Command tasks to automate scripting, plus Decision tasks and file watchers in the workflows.
- Implemented generic audit mappings for effective reuse with each load.
- Developed a Perl script to call the workflows in all environments rather than triggering the jobs manually.
- Experience working with a PuTTY terminal to create, edit and move parameter files, source files, lookup files and target files.
- Experience identifying bottlenecks and performance tuning for better customer delivery.
- Performed data analysis and code validation in the production environment to guard against future failures/anomalies.
- Created deployment groups for Informatica code migration to higher environments.
- Created schema objects such as indexes, views, sequences, stored procedures and constraints.
- Provided daily production support for scheduled jobs and resolutions for failed jobs.
- Worked on batch processing and file transfer protocols using scripting techniques.
- Analyzed enhancements requested by individual clients for the application and implemented them.
- Automated Informatica jobs in production using TWS.
- Resolved defects and issues by tracking them in JIRA.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, and parsing of objects and hierarchies.
- Simulated a real-time transaction feed by fabricating the transactions required for UAT and manually calculating expected results to set expectations going into the testing process.
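The full/incremental load pattern in the bullets above hinges on a persisted watermark: the initial run takes everything, and each daily run takes only rows changed since the prior run. A minimal Python sketch of that delta filter, assuming a `last_run` timestamp saved between runs (field and function names are hypothetical):

```python
from datetime import datetime

def delta_load(rows, last_run):
    """Keep only source rows changed since the last run;
    this is the 'incremental/delta' half of a full/incremental load."""
    return [r for r in rows if r["updated_at"] > last_run]

rows = [
    {"id": 1, "updated_at": datetime(2020, 1, 1)},
    {"id": 2, "updated_at": datetime(2020, 3, 1)},
]
last_run = datetime(2020, 2, 1)       # watermark persisted from the prior run

changed = delta_load(rows, last_run)  # only id 2 is newer than the watermark
new_watermark = max(r["updated_at"] for r in rows)  # saved for the next run
```

In Informatica the same effect is typically achieved with a mapping variable in the source-qualifier filter; the sketch only illustrates the logic.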
Confidential, Albany, NY
Sr. ETL Informatica Developer
Responsibilities:
- Studied and analyzed the mapping document, the required source tables, data types and required transformations based on business requirements and technical specifications.
- Worked with business analysts and the DBA on requirements gathering, business analysis and design of the data mart for an application portal, the primary source of the ETL feed in this project.
- Involved in creating logical and physical data models and in defining integrity constraints between tables.
- Worked closely with BAs to draft source-to-target requirements per the business requirements, and created standard documents for design, review and development.
- Worked on high-level design and low-level design documents for the Medical Data Warehousing project.
- Worked on migrating workflows from PowerCenter to IICS.
- Built a POC and a framework for implementing upcoming projects in IICS.
- Worked on cloud projects to generate files and land them in Azure Data Lake.
- Developed system design workflows in Visio with the Data Architect.
- Worked on release management for the weekly release with the DBA and Data Architect.
- Worked on Informatica code migration and change request management.
- Involved in complex development with agile timelines for business delivery.
- Created mappings involving full/incremental loads and Type 1 and Type 2 loads.
- Developed the required loads with fixed-width and XML file formats.
- Developed interfaces to report data to/from the DWH, which involved development of constraint-based loading and dimension and fact loads.
- Effectively used the standard transformations as well as advanced transformations such as the Java, Address Validator, Transaction Control, SQL and dynamic Lookup transformations.
- Extracted/loaded data from/into diverse source/target systems such as Oracle, XML and flat files.
- Created mapplets/worklets and used them in different mappings/workflows.
- Created Command tasks to automate scripting, plus Decision tasks and file watchers in the workflows.
- Implemented generic audit mappings for effective reuse with each load.
- Developed a Perl script to call the workflows in all environments rather than triggering the jobs manually.
- Experience working with a PuTTY terminal to create, edit and move parameter files, source files, lookup files and target files.
- Experience identifying bottlenecks and performance tuning for better customer delivery.
- Performed data analysis and code validation in the production environment to guard against future failures/anomalies.
- Created deployment groups for Informatica code migration to higher environments.
- Created schema objects such as indexes, views, sequences, stored procedures and constraints.
- Provided daily production support for scheduled jobs and resolutions for failed jobs.
- Worked on batch processing and file transfer protocols using scripting techniques.
- Created various tasks, arranged related tasks into worklets, and arranged worklets in workflows in Workflow Manager according to their interdependencies.
- Analyzed enhancements requested by individual clients for the application and implemented them.
- Resolved defects and issues by tracking them in JIRA.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, and parsing of objects and hierarchies.
- Simulated a real-time transaction feed by fabricating the transactions required for UAT and manually calculating expected results to set expectations going into the testing process.
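Constraint-based loading, mentioned in the dimension/fact bullets above, boils down to ordering targets so that parent tables load before the tables whose foreign keys reference them. A minimal Python sketch of that ordering as a topological sort (the table names are hypothetical; Informatica handles this internally when constraint-based load ordering is enabled on the session):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Each fact lists the dimensions it references, so dimensions come
# out of the sort first and facts last (constraint-based load order).
deps = {
    "dim_member": set(),
    "dim_provider": set(),
    "fact_claims": {"dim_member", "dim_provider"},
}

load_order = list(TopologicalSorter(deps).static_order())
```

The same dependency map also tells a scheduler which loads may run in parallel: any tables with no path between them are independent.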
Confidential, MI
Sr. ETL Informatica Developer
Responsibilities:
- Worked with business analysts and the DBA on requirements gathering, business analysis and design of the data mart for an application portal, the primary source of the ETL feed in this project.
- Involved in creating logical and physical data models and in defining integrity constraints between tables.
- Worked closely with BAs to draft source-to-target requirements per the business requirements, and created standard documents for design, review and development.
- Involved in complex development with agile timelines for business delivery.
- Created mappings involving full/incremental loads and Type 1 and Type 2 loads.
- Developed interfaces that feed data to/from SAP FICO to track financial activities; development required loads with fixed-width and XML file formats.
- Developed interfaces to report data to/from the DWH, which involved development of constraint-based loading and dimension and fact loads.
- Effectively used the standard transformations as well as advanced transformations such as the Java, Address Validator, Transaction Control, SQL and dynamic Lookup transformations.
- Extracted/loaded data from/into diverse source/target systems such as Oracle, XML and flat files.
- Created mapplets/worklets and used them in different mappings/workflows.
- Created Command tasks to automate scripting, plus Decision tasks and file watchers in the workflows.
- Implemented generic audit mappings for effective reuse with each load.
- Developed a Perl script to call the workflows in all environments rather than triggering the jobs manually.
- Experience working with a PuTTY terminal to create, edit and move parameter files, source files, lookup files and target files.
- Experience identifying bottlenecks and performance tuning for better customer delivery.
- Performed data analysis and code validation in the production environment to guard against future failures/anomalies.
- Created deployment groups for Informatica code migration to higher environments.
- Created schema objects such as indexes, views, sequences, stored procedures and constraints.
- Provided daily production support for scheduled jobs and resolutions for failed jobs.
- Worked on batch processing and file transfer protocols using scripting techniques.
- Analyzed enhancements requested by individual clients for the application and implemented them.
- Automated Informatica jobs in production using TWS.
- Resolved defects and issues by tracking them in JIRA.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, and parsing of objects and hierarchies.
- Simulated a real-time transaction feed by fabricating the transactions required for UAT and manually calculating expected results to set expectations going into the testing process.
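The generic audit mappings mentioned above typically write one reconciliation record per run: rows read, rows written, and a pass/fail status. A minimal Python sketch of that check, assuming a simple count comparison (the record fields are illustrative, not a fixed Informatica schema):

```python
def audit_load(source_rows, target_rows, run_id):
    """Build one audit record reconciling source and target row counts;
    a generic check that can be attached to every load."""
    src, tgt = len(source_rows), len(target_rows)
    return {
        "run_id": run_id,
        "source_count": src,
        "target_count": tgt,
        "status": "PASS" if src == tgt else "FAIL",
    }

rec = audit_load(["a", "b", "c"], ["a", "b", "c"], run_id=42)
```

In practice the audit record is inserted into an audit table keyed by run id, so failed loads are visible to production support without opening session logs.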
Confidential, Mather, CA
Sr. ETL Informatica Developer
Responsibilities:
- Good understanding of business requirements, technical specifications, source repositories and physical data models for project activities.
- Experience creating high-level documents, source-to-target mapping documents and detailed design documents for the entire ETL process.
- Extracted/loaded data from/into diverse source/target systems such as Oracle, SQL Server, Salesforce, HL7, EPIC, XML and flat files.
- Worked with application sources such as EPIC, Clarity, Facets and unstructured files.
- The project involved usage of most of the transformations, such as Transaction Control, active and passive Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Unstructured Data and SQL transformations.
- Extensive implementation of incremental/delta loads with the help of mapping variables, mapping parameters and a parameter-table concept.
- Created ETL code to support full loads for the initial run and incremental/delta loads for subsequent daily runs.
- Developed mappings to load data into the landing, staging and publish layers with extensive use of SCD Type I and SCD Type II development concepts.
- Experience developing SCD Type I and Type II with the help of the MD5 hash function.
- Experience working with Informatica's B2B Data Transformation.
- Experience working with advanced Informatica transformations such as the Unstructured Data transformation for parsing HL7 data files.
- Experience working with EDI 834, 835 and 837 transactions relating to member and payment information between provider and payee.
- Experience working in Informatica Data Quality to create a mapplet for validating and cleansing addresses using the Address Validator transformation.
- Exported the mapplets from IDQ into Informatica Power Center to use them in various mappings implementing Address Doctor.
- Hands-on experience profiling data using IDQ.
- Experience extracting and loading data directly into Salesforce objects using Informatica Power Center.
- Experience working with various session properties to extract data from Salesforce objects using the standard API and Bulk API.
- Created new and enhanced existing stored procedure SQL used for semantic views, and load procedures for materialized views.
- Addressed many performance issues in ETL jobs, semantic views, stored procedures, reporting and ad-hoc SQL.
- Worked on performance tuning of SQL, ETL and other processes to optimize session performance.
- Created reusable transformations, mapplets and worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
- Loaded data from unstructured file formats into an Oracle database using the Unstructured Data transformation.
- Tuned Informatica mappings to improve execution time by applying suitable partitioning mechanisms and tuning individual transformations inside the mapping.
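The MD5-based SCD Type II detection mentioned above works by hashing the tracked attributes of each row: if the hash of the incoming row differs from the stored hash, the current version is expired and a new version inserted, instead of comparing every column individually. A minimal Python sketch under those assumptions (table columns and names are illustrative):

```python
import hashlib
from datetime import date

TRACKED = ["name", "city"]  # illustrative Type II columns

def row_hash(row, attrs=TRACKED):
    """MD5 over the tracked attributes; comparing one hash per row
    replaces a column-by-column comparison for Type II change detection."""
    payload = "|".join(str(row[a]) for a in attrs)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

current = {"member_id": 7, "name": "A. Smith", "city": "Denver", "is_current": True}
incoming = {"member_id": 7, "name": "A. Smith", "city": "Mather"}

changed = row_hash(current) != row_hash(incoming)
if changed:
    # Type II: expire the existing version and insert the new one.
    expired = {**current, "end_date": date.today(), "is_current": False}
    new_version = {**incoming, "is_current": True}
```

In Informatica the equivalent is an MD5() call in an Expression transformation feeding a Router that splits unchanged, updated and new rows.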
Confidential, Cleveland, OH
ETL Technical Onsite Lead
Responsibilities:
- Analyzed business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica Power Center.
- Worked extensively on various transformations such as Normalizer, Expression, Union, Joiner, Filter, Aggregator, Router, Update Strategy, Lookup, Stored Procedure and Sequence Generator.
- Developed ETL Informatica mappings to load data into the staging area; extracted data from mainframe files, flat files and SQL Server, and loaded it into an Oracle 11g target database.
- Created workflows and worklets for Informatica mappings.
- Wrote stored procedures and functions to perform data transformations and integrated them with Informatica programs and existing applications.
- Worked on SQL overrides for generated SQL queries in Informatica.
- Involved in unit testing to validate data from different data sources.
- Developed workflows for dimension and fact loads based on daily/monthly runs.
- Developed code to archive monthly data into history tables and made effective use of the history tables to load data back into the system for a particular past month.
- Developed audit tables to keep track of ETL metrics for each individual run.
- Experience working with the Audit Balance Control concept to create parameter files dynamically for each workflow before its run.
- Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions; experience dealing with partitioned tables and automating partition drop/create in an Oracle database.
- Involved in migrating the ETL application from the development environment to the testing environment.
- Performed data validation in the target tables using complex SQL to make sure all modules were integrated correctly.
- Developed Informatica SCD Type I and Type II mappings; extensively used almost all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, mapplets and others.
- Implemented update strategies, incremental loads, change data capture and incremental aggregation.
- Involved in performance tuning for a better data migration process.
- Analyzed session log files to resolve mapping errors, identified bottlenecks, and tuned them for optimal performance.
- Created UNIX shell scripts for Informatica pre-/post-session operations.
- Automated the jobs using the CA7 scheduler.
- Worked on the Direct Connect NDM process to transfer files between servers.
- Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
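The Audit Balance Control step above builds each workflow's parameter file dynamically before the run, which is ordinary text generation. A minimal Python sketch, assuming the PowerCenter parameter-file convention of a [Folder.WF:workflow] section header followed by name=value lines (folder, workflow and parameter names here are hypothetical):

```python
def build_param_file(folder, workflow, params):
    """Render a PowerCenter-style parameter file section for one workflow.
    The [Folder.WF:workflow] header format is the standard convention;
    the parameter names and paths passed in below are illustrative."""
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"{name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

text = build_param_file(
    "FIN", "wf_daily_load",
    {"$PMSourceFileDir": "/data/inbound", "$$LAST_RUN_DATE": "2020-01-01"},
)
```

Generating the file from a control table just before each run is what lets one workflow serve every environment: only the table contents change, not the code.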