
ETL Informatica Developer - Business Process Integration Resume


CA

SUMMARY:

  • 8+ years of experience in the IT industry, including expertise in the design and development of relational database systems and of data warehouses using the Informatica PowerCenter ETL tool.
  • Extensively worked on data warehousing concepts using various ETL tools like Informatica and Oracle Warehouse Builder.
  • Experience with the Informatica ETL tool (v8.x & v9.x); experienced using the Repository Manager, Designer, Workflow Manager and Workflow Monitor tools in Informatica PowerCenter.
  • Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
  • Proficient in ETL (Extract - Transform - Load) using SQL Server Integration Services (SSIS) and Informatica Power Center tool.
  • Strong SSRS development skills, with experience including tablix, matrix, chart and scorecard reports, and sub-reports.
  • SAS ETL developer with expertise in design and development of Extract, Transform and Load processes for data integration projects to build data marts.
  • Extensive understanding of transactional and dimensional data modeling, data warehouse concepts, and designing Star and Snowflake schemas (Kimball methodology) for OLAP systems.
  • Developed Slowly Changing Dimension mappings of Type1, Type2 and Type3 (version, flag and time).
  • Strong Understanding of the FACETS front-end application.
  • Developed and deployed SSIS 2005/2008 packages to maintain, update and migrate data into and from the FACETS application database.
  • Experience in troubleshooting and implementing performance tuning at various levels such as source, target, mapping, session and system in the ETL process.
  • Experienced in CRM Salesforce.com (SFDC) and Professional Services Automation (PSA) applications NetSuite/OpenAir.
  • Expertise in Apex, creating classes, SOQL, SOSL, controllers, triggers, web services, Visualforce pages, components, custom objects and S-Controls.
  • Strong hands-on experience writing WSDL and SOAP web services that support the BigMachines (BMI)/SFDC integration.
  • Experience in data profiling & data quality rules development using Informatica Data Quality tools.
  • Extensively used Netezza Utilities to load and execute sql scripts using Unix.
  • Extensive experience in EDI ASC X12 formats, ICD-9 to ICD-10 and FACETS in health care domain.
  • Hands-on experience with MuleSoft cloud integration and the Mule Enterprise Service Bus (ESB) for data integration.
  • Strong experience in developing complex mappings using transformations like Unconnected and Connected Lookups, Router, Aggregator, Sorter, Joiner, Transaction Control, Filter and Update Strategy.
  • In depth knowledge of SDLC: Rational Unified Process (RUP), Agile, Waterfall and Spiral methodologies
  • Designed Erwin logical and physical data models of the data warehouse.
  • Expertise in data extraction from Flat Files (XML, excel, csv, fixed width, delimited, txt), and Oracle server.
  • Played an integral part in the building of a multi-server, multi-database enterprise Data warehouse using Data Stage ETL (extract, transform and load) tools and SQL Server to load legacy business data.
  • Good experience in handling terabytes of data and good at analyzing data-related issues.
  • Extensively worked on oracle warehouse builder tool. Supported customizations and testing.
  • Created various tasks like sessions, and workflows in the workflow manager to test the mapping during development.
  • Hands on experience on SQL and PL SQL concepts. Working experience on oracle server.
  • Extensively worked on performance tuning areas.
  • Good knowledge on partitions, indexes, table space related issues.
  • Led a four-member offshore team and monitored its regular tasks.
  • Good working experience on partition exchange load process (offline data loads) in oracle.
  • Good in providing estimations on hardware sizing for the database servers.
  • Good at suggesting performance tuning techniques to increase data load throughput.
  • Implement UNIX shell scripts for scheduling various tasks.
  • Strong Knowledge in ETL design document and mapping sheet creation.
  • Good in scheduling jobs and administration activities.
  • Proficient in understanding Functional Requirements and Design documents.
  • Extensive expertise in designing, developing and executing test scenarios and test cases.
  • Possess good communication skills and effective client communication.
  • Strong problem-solving and technical skills coupled with confident decision making, enabling effective solutions and leading to high customer satisfaction.
  • Good experience as a team lead, having led teams of 3-5 members.
  • Self-learner and highly self-motivated attitude.
  • Knowledge of Change Data Capture (CDC) using Oracle 9i and Informatica Power Exchange.
  • Effectively followed industry standards of HIPAA, ANSI 837 and PHI concepts.
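Several bullets above mention UNIX shell scripts for scheduling tasks and for validating incoming flat files before a session runs. The sketch below is a minimal, illustrative pre-session check; the file name and the expected-count convention are assumptions for the example, not details from the resume.

```shell
#!/bin/sh
# Hypothetical pre-session validation: verify an incoming flat file
# exists, is non-empty, and its record count matches the count
# supplied by the upstream control process. Names are illustrative.

validate_incoming_file() {
    datafile=$1
    expected=$2          # expected record count, e.g. from a control file

    if [ ! -s "$datafile" ]; then
        echo "FAIL: $datafile missing or empty"
        return 1
    fi

    actual=$(wc -l < "$datafile" | tr -d ' ')
    if [ "$actual" -ne "$expected" ]; then
        echo "FAIL: expected $expected records, found $actual"
        return 1
    fi

    echo "OK: $datafile passed validation ($actual records)"
    return 0
}

# Example run: create a 3-record sample feed and validate it
printf 'a\nb\nc\n' > /tmp/sample_feed.dat
validate_incoming_file /tmp/sample_feed.dat 3
```

In a real pre-session setup the workflow would call a script like this and abort the session on a non-zero exit code.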

TECHNICAL SKILLS:

ETL/DW Tools: Informatica Power Center 10/9.5.1/9.0/8.6.1 (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, Workflow Monitor) Informatica Power Exchange, Data warehouse Builder, Informatica Cloud.

DataModeling Tools and others: TOAD, PLSQL Developer, SQL developer, Rapid SQL, Teradata, Erwin.

DBMS: Oracle 10g/11g/12c, Teradata, Netezza 6.x/7.x, DB2, SQL Server 2012/2014/2016.

Languages: SQL, PL/SQL, SQL*PLUS, C, Java, XML, HTML.

Operating Systems: Unix, MS Windows 2007/2008/2010.

BI, Reporting and other Tools: FACETS 4.61, SSIS 2008, SSRS 2008, Cognos, SAS BI Suite.

PROFESSIONAL EXPERIENCE:

Confidential, CA

ETL Informatica developer - Business Process Integration

Responsibilities:

  • Understanding existing business model and customer requirements in different local markets.
  • Implementation of the data warehouse components in all the local markets based on the requirements.
  • Worked with Informatica power center 9.5.1 version.
  • Developed mappings using various transformations such as the Source qualifier, Router, Filter, Sequence Generator, Aggregator, Lookup (Connected & Un-Connected), Stored Procedure and Expression as per the business requirement.
  • Used Teradata Administrator and Teradata Manager Tools for monitoring and control the system.
  • Used SAS Data Integration Studio to develop various job processes for ETL (Extract, transform and load) data warehouse database.
  • Analyzed the Business requirement for OWB and Mapped the Architecture.
  • Modified OWB Mappings that populated large confidential tables.
  • Experienced with FACETS in managing healthcare requirements very effectively.
  • Worked with the business/functional unit to assist in the development, documentation, and analysis of functional and technical requirements within FACETS.
  • Performed database health checks and tuned the databases using Teradata Manager.
  • Designed SSIS Packages using several transformations to perform Data profiling, Data Cleansing and Data Transformation.
  • Followed an extremely light and efficient Agile process, adapted to be easily applied, that allowed the project plan to be changed at any time and provided a proven, fast way to prioritize among projects and manage the program in an Agile manner.
  • Involved in collection of product failure data, data preparation, extraction, manipulation and data analysis using SAS.
  • Integrated data sets, small or large, streaming or batch, to process records in parallel.
  • Worked extensively on the Netezza framework on UNIX and contributed to building the customized ELT framework using Shell scripting.
  • Synchronized data sets between business applications, such as syncing contacts between NetSuite and Salesforce, achieving "near real-time" data integration.
  • Handled large quantities of incoming data from an API into a legacy system.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Involved in configuration of the FACETS Subscriber/Member application.
  • Worked on FACETS Data tables and created audit reports using queries. Manually loaded data in FACETS and have good knowledge on FACETS business rules.
  • Experienced with various Databases like Teradata, Oracle 11g/10g/9i, Db2 8x and 9x.
  • Involved in Facets Output generation, Interface development and Facets Migration Projects.
  • Good knowledge on Teradata Manager, TDWM, PMON, DBQL, SQL assistant and BTEQ.
  • Architected services to align with business objectives and fit into overall SOA driven program.
  • Worked with Enterprise Level Data Warehouse Data Modeler/Architect, Business Process designer.
  • Involved in impact analysis of HIPAA and 837P transaction sets on different systems as well as for ICD 9 to ICD 10.
  • Tested the changes for the front end screens in FACETS related to following modules, test the FACETS batches (membership).
  • Responsible for working with the State to review and modify process flows to increase productivity and effectively utilize FACETS features not provided by the legacy systems.
  • Involved in forward mapping from ICD-9 to ICD-10 and backward mapping from ICD-10 to ICD-9 using GEMs, and incorporated that into a translator tool.
  • Performed architecture and configuration analyses of DEV, QA, and PROD environments for several clients.
  • Created/modified various Informatica Mappings and workflows for the successful migration of the data from various source systems to oracle which meets the business requirements for reporting and analysis.
  • Played a Lead role for offshore team with 4 members and monitored.
  • Involved in performance tuning by identifying bottlenecks at the source, target, mapping, session, and database levels.
  • Implemented and upgraded the billing system for all the local markets.
  • Designed and developed UNIX shell scripts to handle pre- and post-session processes and to validate incoming files.
  • Installed and configured Informatica Power Exchange CDC 9.1.0 and 9.0.1 for Oracle on UNIX platform.
  • Responsible for designing, testing, deploying, and documenting the data quality procedures and their outputs.
  • Provided relevant outputs and results from the data quality procedures, including any ongoing procedures that will run after project end.
  • Developing the SQL scripts in TOAD, PLSQL developer and creating Oracle Objects like tables, materialized views, views, Indexes, sequences, synonyms and other Oracle Objects.
  • Excellent understanding of Star Schema modeling, Snowflake modeling.
  • Provided code approvals after peer review of the DataStage jobs, led steering committee meetings and conducted impact analysis.
  • Developed shell scripts, PL/SQL procedures, for creating/dropping of table and indexes of performance for pre and post session management.
  • Created sessions, batches for incremental load into staging tables, and schedule them to run daily.
  • Proactively interacting with clients in understanding the requirements and fixing the issues.
  • Debugged mappings by creating logic that assigns a severity level to each error, and sending the error rows to error table so that they can be corrected and re-loaded into a target system.
  • Developing the unit test case document and performed various kinds of testing like unit testing, regression testing and system test in Dev, QA environments before deployment.
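One bullet above describes debugging mappings with logic that assigns a severity level to each error and sends the error rows to an error table for correction and reload. A minimal shell sketch of that routing idea is below; the pipe-delimited layout, field names and severity rules are hypothetical illustrations, not the actual project rules.

```shell
#!/bin/sh
# Illustrative error-row routing: records failing a check get a
# severity tag and land in an error file (standing in for the error
# table); clean records continue to the target feed. Field layout
# (id|amount) and rules are assumptions for the example.

route_records() {
    infile=$1; goodfile=$2; errfile=$3
    : > "$goodfile"; : > "$errfile"
    while IFS='|' read -r id amount; do
        if [ -z "$id" ]; then
            # missing business key: highest severity
            echo "SEV1|missing key|$id|$amount" >> "$errfile"
        elif ! printf '%s' "$amount" | grep -q '^[0-9][0-9]*$'; then
            # non-numeric amount: lower severity, still rejected
            echo "SEV2|bad amount|$id|$amount" >> "$errfile"
        else
            echo "$id|$amount" >> "$goodfile"
        fi
    done < "$infile"
}

# Example feed: one clean record, one missing key, one bad amount
printf '101|250\n|90\n103|x9\n' > /tmp/feed.dat
route_records /tmp/feed.dat /tmp/good.dat /tmp/errors.dat
grep -c '' /tmp/good.dat     # 1 clean record
grep -c '^SEV' /tmp/errors.dat   # 2 rejected records
```

The rejected rows carry their severity and reason, so they can be corrected and re-fed through the same load.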

Environment: Informatica- Power Center 9.5.1, (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformations Developer, Workflow Manager, Workflow monitor, Repository Manager), Facets 4.6.1, Facets interfaces (IFOX), Facets extensions, Oracle 11g, Netezza 6.X/7.X, Toad, PLSQL developer, SSIS 2008, Windows 7, WinSCP, UNIX shell scripts, Flat files.

Confidential, Chicago, IL

ETL Developer

Responsibilities:

  • Understanding the existing business model and requirements, involved in requirements gathering and understanding business needs.
  • Extensive experience in using ETL process using Informatica Power Center 8.x/7.x, Informatica cloud.
  • Participated in exporting the sources, targets, mappings, workflows, tasks etc., importing them into the new Informatica 8.x environment, and testing and reviewing to make sure all the workflows execute per the design documents and tech specs.
  • Worked with Teradata (V2R5 & 12) and created and manipulated many scripts in BTEQ, FastLoad, MultiLoad, FastExport, and Tpump.
  • Performed Informatica upgrade from V8.6.1 to 9.0.1 and 9.1.0.
  • Developed Triggers, VF Pages, and Controller classes. Created the web Service Client class, Wrapper classes for integration.
  • Migrated the whole application from one instance to another and to production. Performed data migration (from Oracle to salesforce.com) using Data Loader & Informatica Cloud.
  • Extended the functionalities of existing ETL process of Medicaid for Healthcare.
  • Architected and developed FastLoad and MultiLoad control-file scripts, and developed BTEQ scripts to process the data on the staging server.
  • Developing the Mappings using needed Transformations in Informatica tool according to technical specifications
  • Created/modified various Informatica Mappings and workflows for the successful migration of the data from various source systems to oracle which meets the business requirements for reporting and analysis.
  • Data Profiling, Cleansing, Standardizing using IDQ and integrating with Informatica suite of tools.
  • Extensively used XML Source, XML target and XML Parser transformations
  • Imported Source and Target tables from their respective databases
  • Extensively worked with OWB to access wide varieties of data sources.
  • Developing Mapplets and Transformations for migration of data from existing systems to the new system using Informatica Designer.
  • Used CA7 Scheduler to schedule the Data Stage jobs.
  • Tuned Data Stage jobs for better performance to bring design parallelism.
  • Excellent understanding of Star Schema modeling, Snowflake modeling.
  • Extensively worked on Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, and Sorter.
  • Interacted with the offshore staff and the database team to resolve any issues.
  • Preparing the Documentation for the mapping according to the designed logic used in the mapping.
  • Preparing Test Cases and executing them along with the Testing team.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
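Bullets above mention creating and manipulating BTEQ, FastLoad and MultiLoad scripts for Teradata. The sketch below generates an illustrative BTEQ staging-load script from a shell wrapper; the database, table and host names and the logon are placeholders, and the actual `bteq` submission is left commented out so the sketch runs without a Teradata client.

```shell
#!/bin/sh
# Hedged sketch of a shell-generated BTEQ staging load, in the spirit
# of the Teradata scripting described above. All object names and the
# logon string are placeholders, not details from the resume.

cat > /tmp/stage_load.bteq <<'EOF'
.LOGON tdhost/etl_user,**********;
.SET ERRORLEVEL 3807 SEVERITY 0;
DROP TABLE stage_db.customer_stg;
CREATE TABLE stage_db.customer_stg (cust_id INTEGER, cust_nm VARCHAR(100));
INSERT INTO stage_db.customer_stg
SELECT cust_id, cust_nm FROM src_db.customer WHERE load_dt = CURRENT_DATE;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# bteq < /tmp/stage_load.bteq   # submit when a Teradata client is available
echo "generated $(wc -l < /tmp/stage_load.bteq | tr -d ' ') BTEQ lines"
```

The `.SET ERRORLEVEL 3807 SEVERITY 0` line downgrades the "object does not exist" error so the `DROP TABLE` is safe on a first run, and the `.IF ERRORCODE` guard makes the script exit with a non-zero return code the scheduler can trap.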

Environment: Informatica Power Center 9.1, Oracle 10g, IDQ, IDE, SQL, PL/SQL, TOAD 8.5, SQL developer, Teradata, Erwin, PeopleSoft, Mainframe, Unix, BO XI, Tivoli, AIX, Windows.

Confidential, Dallas, TX

ETL Developer

Responsibilities:

  • Understanding the existing business model and requirements, involved in requirements gathering and understanding business needs.
  • Involved in analyzing the requirements and created Design Documents and Data Mapping Documents.
  • Worked with Source Analyzer, Warehouse designer, Mapping, Mapplet Designer, Transformations, Workflow Manager and Monitor.
  • Developed mappings using various transformations such as the Source qualifier, Router, Filter, Sequence Generator, Aggregator, Lookup (Connected & Un-Connected), Stored Procedure and Expression as per the business requirement.
  • Used mapping parameters and variables to facilitate the reusability of code.
  • Developed mappings to implement Slowly Changing Dimensions (Type 1 & 2).
  • Involved in performance tuning by identifying bottlenecks at the source, target, mapping, session, and database levels.
  • Designed and developed UNIX shell scripts to handle pre- and post-session processes and to validate incoming files.
  • Developing the SQL scripts in TOAD and creating Oracle Objects like tables, materialized views, views, Indexes, sequences, synonyms and other Oracle Objects.
  • Debugged mappings by creating logic that assigns a severity level to each error, and sending the error rows to error table so that they can be corrected and re-loaded into a target system.
  • Developing the unit test case document and performed various kinds of testing like unit testing, regression testing and system test in Dev, QA environments before deployment.
  • Developed Audit Strategy to validate the data between Source and Target System.
  • Involved in Production Support.
  • Supported the Implementation and Post-Implementation process.
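One bullet above describes an audit strategy for validating data between the source and target systems. A minimal sketch of a row-count reconciliation in that spirit follows; in the real job the counts would come from SQL queries against each system, so the two sample extract files and the table name here are stand-ins.

```shell
#!/bin/sh
# Illustrative source-vs-target audit: compare record counts and flag
# any mismatch. In practice the counts would come from SQL queries;
# two local extract files simulate them here.

audit_counts() {
    src_cnt=$1; tgt_cnt=$2; tbl=$3
    if [ "$src_cnt" -eq "$tgt_cnt" ]; then
        echo "PASS: $tbl source=$src_cnt target=$tgt_cnt"
    else
        echo "FAIL: $tbl source=$src_cnt target=$tgt_cnt"
        return 1
    fi
}

# Simulated extracts: three records on each side
printf 'r1\nr2\nr3\n' > /tmp/src_extract.dat
printf 'r1\nr2\nr3\n' > /tmp/tgt_extract.dat
audit_counts "$(grep -c '' /tmp/src_extract.dat)" \
             "$(grep -c '' /tmp/tgt_extract.dat)" customer_dim
```

A mismatch returns a non-zero status, so the audit step can fail the batch and surface the table name in the log.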

Environment: Informatica Power Center 8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Power Exchange Navigator 8.6.1, IDE, SQL Query Analyzer 8.0, Oracle 9i/10g, SQL Server 2005, SQL Server Management Studio, Toad, SQL Developer 1.1.2, PVCS, Tivoli, AIX, Sun Solaris & Windows NT, Shell Scripting.

Confidential

ETL Developer

Responsibilities:

  • Understanding existing business model and customer requirements.
  • Developed mappings using informatica for data flows from Source to Target.
  • Used various transformations like Joiner, Filter, Lookup, Expression, Aggregator, etc. in mappings to perform the ETL process.
  • Reviewed mappings for design flaws as part of peer testing.
  • Created Effective Test Data and Unit Test cases to ensure successful execution of data loading processes.
  • Performed Informatica code migrations, testing, debugging and documentation.
  • Create and maintain documentation related to production batch jobs.

Environment: Informatica PowerCenter 8.1.3 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Ascential DataStage, MicroStrategy 7.1.5, Oracle 9i, TOAD 8.6.1, Sun Solaris & Windows NT, Shell Scripting.
