
ETL Informatica Developer Resume


Livonia, MI

SUMMARY

  • 8+ years of experience in Information Technology with a strong background in Database development and Data warehousing.
  • Good experience in designing and implementing Data Warehousing and Business Intelligence solutions using ETL tools such as Informatica PowerCenter, Informatica Developer (IDQ) and Informatica PowerExchange.
  • Created mappings in the Mapping Designer to load data from various sources using transformations such as Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure, Unstructured Data transformation and more.
  • Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, SCD Type 2, SCD Type 3), Change Data Capture, Dimensional Data Modeling, the Ralph Kimball approach, Star/Snowflake modeling, Data Marts, OLAP, Fact and Dimension tables, and Physical and Logical data modeling.
  • Experience in developing Transformations, Mapplets and Mappings in Informatica Designer and creating tasks using Workflow Manager to move data from multiple sources to target.
  • Experience working with the Informatica Data Quality tool to effectively use the Address Validator transformation for Address Doctor implementation.
  • Worked in different phases of projects involving Requirements Gathering and Analysis, Design, Development, Testing, Deployment and Support.
  • Good Knowledge on Database architecture of OLTP and OLAP applications, Data Analysis.
  • Experience with scheduling tools such as Control-M and Informatica Scheduler.
  • Experience working with various Informatica concepts like partitioning, Performance tuning, identifying bottlenecks, deployment groups, Informatica scheduler and more.
  • Very strong skills and a clear understanding of requirements and solutions to various implementation issues throughout the Software Development Life Cycle (SDLC).
  • Hands-on experience in PL/SQL (Stored Procedures, Functions, Packages, Triggers, Cursors, Indexes), UNIX Shell scripting and Windows Batch scripting.
  • Expertise in Unit Testing, Integration Testing, System Testing and Data Validation for developed Informatica mappings.
  • Experience in Production Support, emergency fixes and migrating fixes from lower environments to higher environments as per the policies.
  • Very good exposure to Oracle 11g, MS SQL Server 2012/2008, Teradata v13.
  • Excellent Verbal and Written Communication Skills. Have proven to be highly effective in interfacing across business and technical groups.

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.6.1/9.5.1, Informatica PowerExchange 9.5.1, Informatica Developer 9.6.1

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimensional Data Modeling.

Databases: Oracle11g/10g/9i, SQL Server 2012/2008, Teradata v13

Scheduling Tools: CA7 Scheduler, TWS (Tivoli), Informatica Scheduler, BMC Control-M.

Reporting Tools: Crystal Reports, MicroStrategy, Hyperion Essbase.

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, C, C++, Shell, Batch

Operating Systems: Windows 7/XP/NT/2000, DOS, UNIX and LINUX

Other Tools: SQL*Plus, Toad, SQL Developer, PuTTY, WinSCP, MS Office.

PROFESSIONAL EXPERIENCE

Confidential, Livonia, MI

ETL Informatica Developer

Environment: Informatica PowerCenter 10.1/9.6.1, Oracle 11g, SQL Server 2012, Teradata v13, flat files, XML files, Informatica IDQ, B2B Data Transformation, Informatica Cloud, Informatica Address Doctor, Batch Scripting, MicroStrategy.

Responsibilities:

  • Developed a thorough understanding of business requirements, technical specifications, source repositories and physical data models for project activities.
  • Created high-level documents, Source-to-Target mapping documents and detailed design documents for the entire ETL process.
  • Extracted/loaded data from/into diverse source/target systems such as Oracle, SQL Server, Salesforce, HL7, EPIC, XML and flat files.
  • Set up repeating groups in parser to effectively parse the EDI standard files.
  • Used most of the core transformations, including Transaction Control, active and passive Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Unstructured Data transformation, SQL transformation and more.
  • Extensively implemented incremental/delta loads using mapping variables, mapping parameters and the parameter table concept.
  • Created ETL code to support full loads for the initial run and incremental/delta loads for subsequent daily runs.
  • Developed mappings to load data into the landing, staging and publish layers with extensive use of the SCD Type I and SCD Type II development concepts.
  • Developed SCD Type I and Type II logic using the MD5 hash function to detect changed records.
  • Experience working with B2B data transformation concept of Informatica.
  • Experience working with advanced Informatica transformation like unstructured data transformation for parsing HL7, EDI data file.
  • Worked in Informatica Data Quality to create a mapplet for validating and cleansing addresses using the Address Validator transformation.
  • Exported the mapplets from IDQ into Informatica PowerCenter for use in various mappings implementing Address Doctor.
  • Hands on experience working on profiling data using IDQ.
  • Worked with Teradata utilities using BTEQ scripts, FastLoad, MultiLoad, FastExport and TPump.
  • Worked with TPT connections in Informatica; handled log files and error files during Teradata loads.
  • Extracted from and loaded data directly into Salesforce objects using Informatica PowerCenter.
  • Used various session properties to extract data from Salesforce objects via the standard API and Bulk API.
  • Created new and enhanced existing stored procedure SQL used for semantic views and load procedures for materialized views.
  • Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
  • Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Loaded data from Unstructured file format using unstructured data transformation into Oracle database.
  • Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
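
The MD5-based SCD change detection mentioned above compares one hash of the tracked columns instead of comparing each column individually. A minimal sketch of the idea, not the actual mappings (the dictionaries and column names here are hypothetical stand-ins for lookup and expression ports):

```python
import hashlib

def row_md5(row, columns):
    """Concatenate the tracked columns with a delimiter and hash them,
    mirroring an MD5(...) expression over concatenated ports."""
    joined = "|".join("" if row.get(c) is None else str(row[c]) for c in columns)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def detect_change(incoming, existing, tracked_cols):
    """Classify a record for an SCD load: new key -> INSERT,
    differing hash -> UPDATE (new Type II version), else NO_CHANGE."""
    if existing is None:
        return "INSERT"
    if row_md5(incoming, tracked_cols) != row_md5(existing, tracked_cols):
        return "UPDATE"
    return "NO_CHANGE"
```

Comparing a single stored hash against a freshly computed one avoids a wide column-by-column comparison in the update-strategy logic.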

Confidential

ETL Informatica Developer

Environment: Informatica PowerCenter 9.1/9.6.1, PowerExchange 9.1, PL/SQL, Oracle 10g, SQL Server 2008, Windows Server 2003, UNIX, Tivoli Scheduler, Toad

Responsibilities:

  • Worked with Business analysts and the DBA for requirements gathering, business analysis and designing of the data warehouse.
  • Involved in creating Logical and Physical Data Models and creating Star Schema models.
  • Assisted in designing Star Schema to design dimension and fact tables.
  • Involved in developing star schema model for target database using ERWIN Data modeling.
  • Worked on Informatica PowerCenter - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Worked extensively across different domains such as Member, Claim, Pharmacy and Product.
  • Processed healthcare files such as 835, 837, HL7 and NCPDP files.
  • Created complex mappings using unconnected and connected Lookup, Aggregator and Router transformations to populate target tables in an efficient manner.
  • Extracted/loaded data from/into diverse source/target systems like Oracle, XML and Flat Files
  • Created Mapplet and used them in different Mappings.
  • Used the Sorter transformation and the dynamic Lookup cache.
  • Created events and tasks in the workflows using Workflow Manager.
  • Developed Informatica mappings, tuned them for better performance, and built business rules for loading data with PL/SQL procedures/functions.
  • Created Schema objects like Indexes, Views and Sequences.
  • Provided daily production support for scheduled jobs, diagnosing and fixing failed jobs.
  • Created Stored Procedures, Functions and Indexes Using Oracle.
  • Worked on batch processing and file transfer protocols.
  • Tuned the performance of long-running workflows.
  • Analyzed the enhancements coming from the individual Client for application and implemented the same.
  • Creation of technical documents.
  • Developed mappings for policy, claims dimension tables.
  • Working with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Simulated a real-time transaction feed by fabricating the transactions required for UAT and manually calculating expected results to establish expectations going into the testing process.
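
The dimension and fact loads above hinge on resolving each incoming natural key to a dimension surrogate key, with a default member for unmatched rows, much as a connected Lookup with a default value behaves. A rough sketch under those assumptions (all table, key and column names are hypothetical):

```python
def load_fact(transactions, dim_keys, default_key=-1):
    """Build fact rows by resolving each transaction's natural member key
    to a dimension surrogate key; unmatched keys fall back to a default
    'unknown' dimension member rather than being dropped."""
    fact_rows = []
    for txn in transactions:
        surrogate = dim_keys.get(txn["member_id"], default_key)
        fact_rows.append({"member_sk": surrogate, "claim_amt": txn["claim_amt"]})
    return fact_rows
```

Routing unmatched keys to a default member keeps the fact load complete and makes late-arriving dimension rows easy to reconcile afterwards.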

Confidential, Chicago, IL

ETL Informatica Developer

Environment: Informatica 8.6.1, Oracle 11g, SQL Server 2008, PowerExchange 8.6.1, flat files, Mainframe, Toad for Oracle 11, Harvest version control tool, Windows Server, UNIX, Perl Scripting and CA7 Scheduler

Responsibilities:

  • Analyzed the business requirements, framed the business logic for the ETL process and maintained the ETL process using Informatica PowerCenter.
  • Worked extensively on various transformations such as Normalizer, Expression, Union, Joiner, Filter, Aggregator, Router, Update Strategy, Lookup, Stored Procedure and Sequence Generator.
  • Developed ETL Informatica mappings to load data into the staging area; extracted data from Mainframe files, flat files and SQL Server and loaded it into the Oracle 11g target database.
  • Created workflows and worklets for Informatica mappings.
  • Wrote stored procedures and functions to perform data transformations and integrated them with Informatica programs and the existing applications.
  • Wrote SQL overrides for generated SQL queries in Informatica.
  • Performed unit testing to check the validity of the data from different data sources.
  • Developed workflows for dimension loads, fact loads based on daily/monthly runs.
  • Developed code to archive monthly data into history tables and effective use of the history table to load the data back into the system for a particular past month.
  • Developed Audit tables to keep the track of ETL Metrics for each individual run.
  • Used the Audit Balance Control concept to create parameter files dynamically for each workflow before its run.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions; worked with partitioned tables and automated the dropping and creation of partitions in the Oracle database.
  • Migrated the ETL application from the development environment to the testing environment.
  • Performed data validation in the target tables using complex SQL to make sure all the modules were integrated correctly.
  • Developed Informatica SCD type-I, Type-II mappings. Extensively used almost all of the transformations of Informatica including complex lookups, Stored Procedures, Update Strategy, mapplets and others.
  • Implemented update strategies, incremental loads, Data capture and Incremental Aggregation.
  • Performed performance tuning for a better data migration process.
  • Analyzed session log files to resolve mapping errors, identified bottlenecks and tuned them for optimal performance.
  • Created UNIX shell scripts for Informatica pre/post-session operations.
  • Automated the jobs using CA7 Scheduler.
  • Worked on the Connect:Direct (NDM) process to transfer files between servers.
  • Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
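
The dynamic parameter-file creation above (the Audit Balance Control step) boils down to rendering a workflow-scoped section header followed by `$$` mapping parameters. A rough sketch, assuming the conventional `[Folder.WF:name]` Informatica parameter-file layout (the folder, workflow and parameter names shown are hypothetical):

```python
def build_param_file(folder, workflow, params):
    """Render an Informatica-style parameter file: a [Folder.WF:name]
    section header followed by one $$NAME=value line per mapping parameter."""
    lines = [f"[{folder}.WF:{workflow}]"]
    for name, value in params.items():
        lines.append(f"$${name}={value}")
    return "\n".join(lines) + "\n"
```

A pre-session step would write this string to the directory referenced by the session's parameter file path, so each run picks up freshly computed values such as the run date and load type.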

Confidential, Madison, WI

Informatica Developer

Environment: Informatica PowerCenter 8.6.1, Oracle 9i, fixed-width & delimited flat files, XML files, UNIX Shell Scripting, Perl Scripting, SAP FICO, TWS (Tivoli), JIRA, PuTTY

Responsibilities:

  • Worked with business analysts and the DBA on requirements gathering, business analysis and the design of the data mart for an application portal, the primary source of the ETL feed in this project.
  • Involved in creating Logical and Physical Data Models and creating models by defining referential integrity between tables.
  • Worked closely with BAs to draft source-to-target requirements per the business needs and to create standard documents for design, review and development.
  • Involved in complex development with agile timelines for business delivery.
  • Created mappings involving concept of full/incremental loads, type 1 and type 2 loads.
  • Developed interfaces that feed data to/from SAP FICO to track financial activities; these loads required fixed-width and XML file formats.
  • Developed interfaces to report the data to/from DWH which involved development for constraint based loading, dimension and fact loads.
  • Effectively used all the standard transformations and advanced transformations such as the Java transformation, Address Validator transformation, Transaction Control transformation, SQL transformation and dynamic Lookup transformation.
  • Extracted/loaded data from/into diverse source/target systems like Oracle, XML and Flat Files
  • Created Mapplets/Worklets and used them in different Mappings/Workflows.
  • Created Command tasks to automate scripting, and used Decision and File Watcher tasks in the workflows.
  • Implemented generic audit mappings for effective use with each load.
  • Developed Perl scripts to call the workflows in all environments rather than triggering the jobs manually.
  • Worked in a PuTTY terminal to create, edit and move parameter files, source files, lookup files and target files.
  • Experience working on identifying bottleneck, performance tuning for better customer delivery.
  • Performed data analysis and code validation in the production environment to guard against future failures/anomalies.
  • Creating Deployment group for Informatica code migration to higher environments.
  • Created Schema objects like Indexes, Views, Sequences, Stored Procedures and Constraints
  • Provided daily production support for scheduled jobs and resolved failed jobs.
  • Worked on batch processing and file transfer protocols using scripting techniques.
  • Analyzed the enhancements coming from the individual Client for application and implemented the same.
  • Automated Informatica jobs in production using TWS.
  • Resolved defects and issues by tracking them in JIRA.
  • Working with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Simulated a real-time transaction feed by fabricating the transactions required for UAT and manually calculating expected results to establish expectations going into the testing process.
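
Scripted workflow kick-off of this kind (done here via Perl) typically wraps Informatica's `pmcmd` command-line utility. A sketch in Python that only builds the argument list (the service, domain, user, folder and workflow names are hypothetical, and authentication is simplified):

```python
def pmcmd_start_command(service, domain, user, folder, workflow):
    """Build a pmcmd startworkflow invocation as an argument list.
    In real use the password would come from an environment variable
    (e.g. via pmcmd's password-variable option) rather than the command line."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,   # Integration Service name
        "-d", domain,     # Informatica domain
        "-u", user,       # repository user
        "-f", folder,     # repository folder
        "-wait",          # block until the workflow finishes
        workflow,
    ]
```

The resulting list could be handed to `subprocess.run`, and the exit code checked to decide whether the scheduler should flag the job as failed.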
