ETL Lead/Data Modeler Resume

McLean, VA

PROFESSIONAL SUMMARY:

  • Over 13 years of experience in the IT industry. Main areas of expertise are Informatica, Erwin, Hadoop, Hive, Pig, Oracle, MS SQL Server, Teradata, DB2, UNIX, Cognos ReportNet, OBIEE, and Business Objects.
  • Experienced in analyzing disparate systems for system studies and data warehouse implementations.
  • Experienced in Data Analysis, Data modeling, ETL, Data Warehousing, Reporting, Development, Maintenance, Testing and Documentation.
  • Experienced in OLTP/OLAP Systems, Analysis, and Data model Schemas like Star schema, Snowflake schema and multidimensional modeling.
  • Experienced in conceptual, logical (LDM), and physical (PDM) modeling, forward/reverse engineering, and Complete Compare using Erwin.
  • Well versed in Normalization (1NF, 2NF and 3NF) and Denormalization techniques for optimum performance in relational and dimensional database environments.
  • Extensive experience in implementing Informatica PowerCenter components such as Source Analyzer, Target Designer, Mapping/Mapplet/Transformation Designer, and Workflow Manager/Workflow Monitor.
  • Extensively worked on Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, SQL, Data Masking, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, and Transaction Control.
  • Experience in using different parsing techniques in Informatica B2B Data Transformation Studio.
  • Experience in using data sources such as Oracle, Teradata, DB2, MS SQL Server, flat files, and XML files.
  • Experience in writing complex SQL queries, stored procedures and functions using PL/SQL programming.
  • Extensive experience with Teradata load utilities such as FastLoad, TPump, MultiLoad, FastExport, and TPT (Teradata Parallel Transporter), with PDO (Pushdown Optimization), and with BTEQ scripts.
  • Experience in UNIX shell scripting and using PMCMD commands to run Informatica workflows (a wrapper sketch follows this list).
  • Experienced in handling SCDs (Slowly Changing Dimensions) using Informatica PowerCenter.
  • Experience in change data capture (CDC).
  • Extensive experience in performance tuning: identifying and fixing bottlenecks and tuning complex Informatica mappings for better performance.
  • Performed unit testing and documented test results.
  • Experience in test deployment of Informatica objects using Informatica deployment groups, and of non-Informatica objects using Eclipse and UNIX deployment processes.
  • Experience in using PuTTY and in scheduling workflows with Autosys, Zena, DAC, Control-M, and Tidal.
  • Experience with tools such as Teradata SQL Assistant, DBVisualizer, SQL Developer, Toad, and MS SQL Server Management Studio.
  • Experience in understanding the requirements and preparing data mapping documents.
  • Expertise in data warehousing concepts such as OLTP/OLAP system study and the Star and Snowflake schemas used in relational and dimensional modeling.
  • Implemented SCD Type 1/2/3/4/5/6, incremental, and CDC logic according to business requirements (an SCD Type 2 sketch follows this list).
  • Guided the source team (DBAs) in configuring the source end to support Informatica CDC.
  • Guided the dev team in data map creation, registration group creation, and DB row tests in Informatica PowerExchange, and in developing CDC mappings/workflows in Informatica PowerCenter and debugging CDC-related issues.
  • Hands-on experience with Hadoop ecosystem components such as HDFS, Sqoop, Pig, Hive, HBase, and Oozie.
  • Expert in working with the Hive data warehouse tool: creating tables, distributing data through partitioning and bucketing, and writing and optimizing HiveQL queries.
  • Experienced in writing Pig scripts to build ETL pipelines and in creating a data warehouse on Hive.
  • Experience in using Apache Sqoop to import and export data between Oracle/SQL Server and HDFS/Hive.
  • Hands on experience in setting up workflow using Apache Oozie workflow engine for managing and scheduling Hadoop jobs.
  • Good knowledge of NoSQL and MPP databases.
  • Experience in using HCatalog with Hive, Pig, and HBase.
  • Implemented architecture best practices such as error handling and recycle-on-failure processes.
  • Experienced in Business Objects XI R2/XI R3.1.1 and SAP BI 4.0 (BO Reporter, Designer, Scheduler, InfoView, Desktop Intelligence, and Web Intelligence).
  • Experience with Web Intelligence, creating ad-hoc reports, and archiving reports.
  • Experienced in designing and developing the user interface (Universe) and CMC/CMS, and in scheduling reports using the Scheduler.
  • Installed and configured all three environments (DEV/QA/PROD) with Business Objects XI R2/XI R3.2, with Apache and Tomcat as the web and app servers, on Windows and UNIX servers.
  • Expertise in designing, building, and maintaining Universes: resolving issues such as loops and traps using aliases and contexts, and designing complex objects using the @Prompt and @Aggregate_Aware functions and cascaded lists of values and prompts. Experience in creating Universes over views built on different data providers: SQL Server, Oracle, DB2, and MS Access.
  • Expertise in retrieving data using Business Objects Universes, personal data files, stored procedures, and freehand SQL. Hands-on experience in linking Universes and implementing object-level and row/column-level security during Universe design.
  • Worked on Web Intelligence to develop, publish and schedule thin client reports.
  • Created ad-hoc reports with multi prompts to retrieve required data for the user and to minimize unnecessary load on the server.
  • Developed complex reports in WebI and DeskI using date functions, calculations, formulas, and filters; linked reports and performed drilling using the OpenDoc function in InfoView.
  • Performed slicing/dicing, multi-tabbing, cross-tabbing, drilling, and scheduling, and created master-detail reports.
  • Extensively worked on creating and executing test procedures, test cases, and test scripts using manual and automated methods.
  • Maintained and documented business rules, and provided communication and planning for customer implementations as necessary.
  • Strong experience in understanding business applications, business data flow, and data relations.
  • Worked on RDBMSs such as Oracle, SQL Server, Teradata, and DB2.
  • Good communication and analytical skills; flexible in learning new advancements in the IT industry.
  • Excellent Client handling ability with good presentation skills.
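
A minimal sketch of the shell/PMCMD wrapper pattern mentioned above: launch an Informatica workflow and check its exit status. The service, domain, folder, and workflow names are placeholders, not values from the projects below.

#!/bin/sh
# Launch an Informatica workflow via pmcmd and fail loudly on a bad exit.
# -uv/-pv name environment variables holding the repository credentials.
pmcmd startworkflow -sv INT_SVC -d DOM_DEV -uv PM_USER -pv PM_PASS \
    -f DW_FOLDER -wait wf_daily_load
if [ $? -ne 0 ]; then
    echo "wf_daily_load failed" >&2
    exit 1
fi
echo "wf_daily_load completed"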

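An SCD Type 2 sketch in plain SQL (expire-and-insert), the logic the PowerCenter update-strategy mappings above implement; the table, column, and sequence names are hypothetical, and only one tracked attribute is shown.

#!/bin/sh
# $DW_CONN is assumed to hold a user/password@tns connect string.
sqlplus -s "$DW_CONN" <<'EOF'
-- Expire the current version of any customer whose tracked attribute changed.
UPDATE dim_customer d
   SET d.eff_end_dt = SYSDATE, d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.cust_id = d.cust_id AND s.address <> d.address);
-- Insert a fresh current version for changed and brand-new customers.
INSERT INTO dim_customer
       (cust_key, cust_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.address, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d2
                    WHERE d2.cust_id = s.cust_id AND d2.current_flag = 'Y');
EXIT;
EOF
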
TECHNICAL SKILLS:

ETL Tools: Informatica 7.1.1/8.6.1/9.0.1/9.5.1/9.6.1, Informatica BDE.

OLAP Tools: Business Objects XI R2/XI R3.1.1/4.0, OBIEE, Cognos ReportNet, SSRS.

Big Data: Hadoop, Hive, Pig, HBase, Sqoop, Oozie, Hortonworks

Databases: Oracle 9i/10g/11g, SQL Server, Teradata 13, and IBM DB2; client tools: Teradata SQL Assistant, DBVisualizer, SQL Developer, Toad

Data Modeling Tool: Erwin 5.1/7/9.5.

Scheduling Tools: Zena, Autosys, Control-M, Tidal, and DAC

Domain Knowledge: Banking, Finance, Insurance, Telecom, Energy & Retail

Programming Languages: C, C++, COBOL, VB Scripting, SQL, PL/SQL, UNIX Shell Scripting

Telecom Related Skills: SMS-MO, SMS-MT, GPRS, and WAP.

Deployment and FTP Tools: WinSCP, FileZilla, Git, SharePoint

Other Tools: Quality Center, Jira, Maximo, IBM ClearCase, IBM ClearQuest, PuTTY

Operating Systems: Windows 98/XP/2000/NT/7/8, UNIX, and AIX.

PROFESSIONAL EXPERIENCE:

Confidential, McLean, VA

ETL-Lead/Data Modeler

Responsibilities:

  • Understood the business requirements, converted them into functional and technical specifications, and developed mappings for the extraction, transformation, cleansing, and loading of the DW to deliver on time.
  • Involved in designing the data model, deriving the conceptual, logical, and physical models using Erwin with the architect teams. Defined and documented the technical architecture of the data warehouse, including the physical components and their functionality.
  • Contributions included requirements gathering, design, development, implementation, and client coordination.
  • Developed Informatica mappings and enhanced the current functionality of existing mappings.
  • Extensively worked on creating and executing test procedures, test cases, and test scripts using manual and automated methods.
  • Performed system testing and supported defect fixes raised during all phases of SIT (System Integration Testing).
  • Managed a team of developers to achieve project-specific goals.

Environment: Informatica 9.6.1, Sybase, DB2, XSD, XML, Erwin 9.5, Beyond Compare, WinSCP, Autosys, MS Visio, Rapid SQL & UNIX.

Confidential, Saint Louis, MO.

DW-BI Technology Lead/ETL-Architect/Data Modeler/Hadoop Developer

Responsibilities:

  • Understood the business requirements, converted them into functional and technical specifications, and developed mappings for the extraction, transformation, cleansing, and loading of the DW to deliver on time.
  • Involved in designing the data model, deriving the conceptual, logical, and physical models using Erwin with the architect teams. Defined and documented the technical architecture of the data warehouse, including the physical components and their functionality.
  • Contributions included requirements gathering, design, development, implementation, and client coordination.
  • Developed Informatica mappings and enhanced the current functionality of existing mappings.
  • Extensively worked on creating and executing test procedures, test cases, and test scripts using manual and automated methods.
  • Performed system testing and supported defect fixes raised during all phases of SIT (System Integration Testing).
  • Managed a team of developers to achieve project-specific goals.

Environment: Informatica 9.6.1, Informatica PowerExchange, Informatica BDE, Hadoop, Hive, Pig, Erwin 9.5, Oracle 11g, DB2, SQL Server, Autosys, MS Visio, Toad & UNIX.

Confidential, Hartford, CT.

DW-BI Technology Lead/Data Architect/Data Modeler/Hadoop Developer.

Responsibilities:

  • Understood the FWA (fraud, waste, and abuse) use case and business requirements (Claims, Providers, Members, SIU History, and EWM), converted them into functional and technical specifications, and developed mappings for the extraction, transformation, cleansing, and loading of the DW to deliver on time.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Created Hive scripts to extract summarized information from Hive tables.
  • Implemented Pig scripts to build the ETL pipeline and created a data warehouse on Hive.
  • Developed Hive and Pig scripts to extract data from staging tables.
  • Created test plan documents for each work request.
  • Executed unit testing and data validations for each work request.
  • Used a Git repository to push and pull code on feature branches against the master repository, then raised merge requests for review before promoting code to the test and production environments.
  • Developed wrapper and common scripts for the job automation process.
  • Created a one-time metadata model for Hive, Pig, and Sqoop jobs across different applications using a MySQL database.
  • Developed a workflow in Oozie to orchestrate a series of Pig scripts that cleanse data, such as removing personal information.
  • Raised and assigned issues, tickets, and tasks using JIRA.
  • Used Pig for three distinct workloads: pipelines, iterative processing, and research.
  • Moved log files generated from various sources to HDFS via Flume for further processing and processed the files using Piggybank functions.
  • Provided ad-hoc queries and data metrics to the business users using Hive and Pig.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them.
  • Created Pig Latin scripts to sort, group, join, and filter the enterprise-wide data.
  • Implemented partitioning, dynamic partitions, and buckets in Hive (see the Hive sketch after this list).
  • Gained experience in managing and reviewing Hadoop log files.
  • Good understanding of the Informatica BDE ETL tool and how it can be applied in a big data environment.
  • Managed a team of developers to achieve project-specific goals.
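
A minimal sketch of the Hive partitioning and bucketing noted in this list: create a partitioned, bucketed ORC table and load it from staging with dynamic partitions. The table and column names are hypothetical, not the project's actual schema.

#!/bin/sh
# Create a partitioned, bucketed ORC table and load it from a staging table.
hive -e "
CREATE TABLE IF NOT EXISTS claims_dw (
  claim_id  BIGINT,
  member_id BIGINT,
  claim_amt DECIMAL(12,2)
)
PARTITIONED BY (claim_month STRING)
CLUSTERED BY (member_id) INTO 32 BUCKETS
STORED AS ORC;

SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
SET hive.enforce.bucketing=true;
INSERT OVERWRITE TABLE claims_dw PARTITION (claim_month)
SELECT claim_id, member_id, claim_amt, claim_month
FROM staging_claims;
"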

Environment: Hadoop, Sqoop, HDFS, Hive, Pig, HBase, Oozie, Hortonworks, Informatica BDE, MS Visio and UNIX.

Confidential, Charlotte, NC.

DW-BI Technology Lead /ETL-Architect/Data Modeler/BO Developer.

Responsibilities:

  • Studied and understood the business and technical requirements of the system.
  • Coordinated with and provided proper inputs to the Confidential Informatica admin for Informatica environment setup, and to the UNIX admin for directories and paths, for the Confidential Energy FRE project.
  • Coordinated with the onshore teams on data mapping gap analysis and on logical and physical data model gaps using Erwin 4.5.
  • Created the SDS (System Delivery Specification-ETL) and TSD (Technical System Design Document-ETL) design documents for the Confidential Energy FRE project.
  • Prepared impact analysis documents for each work request from the business.
  • Prepared estimates for each release based on the work requests.
  • Distributed the work requests across the team and assigned work to each team member.
  • Created the design and test plan documents for each work request.
  • Uploaded the design documents to SharePoint (FRE) for onsite team approval.
  • Developed mappings using Source Qualifier, Joiner, Lookup, Expression, Aggregator, Filter, Router, and Update Strategy transformations.
  • Created sessions and workflows using Informatica 9.6.1.
  • Involved in performance tuning at various levels, including target, source, mappings, and sessions for large data files.
  • Worked on Teradata utilities such as FastLoad, FastExport, and MultiLoad, and familiar with TPump.
  • Used BTEQ commands to export and import data (see the BTEQ sketch after this list).
  • Executed unit testing and data validations for each work request.
  • Prepared the consolidated promote checklists for QA in the testing environment.
  • Migrated the mappings from Dev to QA using Informatica Repository Manager.
  • Compared the mappings from Dev to QA before the QA run and created the deployment groups.
  • Implemented shell scripts and param files, and scheduled the jobs using cron in UNIX to invoke Informatica workflows.
  • Conducted regression testing using the Informatica Data Validation Option (DVO) tool.
  • Prepared the consolidated promote checklists for the production environment.
  • Prepared the ETL process flow documents for UAT.
  • Involved in analyzing business requirements and data specifications for Business Objects Universes and reports.
  • Performed project data audits to ensure quality and data integrity.
  • Resolved loops, chasm traps, and fan traps using contexts, aliases, and integrity checks.
  • Created the Universes using Business Objects XI R3.1.1.
  • Created thin-client reports using WebI.
  • Scheduled reports using the Scheduler.
  • Linked Universes based on the requirements.
  • Published and scheduled the documents.
  • Created reports using Business Objects functionality such as queries, drill down, cross tab, and master-detail.
  • Used drill up/down operations for different aspects of the client business.
  • Migrated Business Objects overall from the old database to the new database.
  • Managed a team of developers to achieve project-specific goals.
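
A minimal BTEQ export sketch for the import/export work in this list; the logon string, database, table, and output path are placeholders, with the password read from an environment variable.

#!/bin/sh
# Export a query result to a flat file with Teradata BTEQ.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD}
.EXPORT REPORT FILE = /data/out/fre_extract.txt
SELECT acct_id, bal_amt
FROM   fre_db.account_bal;
.EXPORT RESET
.LOGOFF
.QUIT
EOF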

Environment: Informatica 9.6.1, Informatica DVO, Netezza, Matrix Database, Vector Database, FileZilla, PuTTY, WinSCP, DBVisualizer, flat files, Business Objects XI R3.1.1, Oracle 11g, Teradata 13, Erwin 9, MS Visio, Toad & UNIX.

Confidential, Charlotte, NC.

DW-BI Technology Lead/Data Architect/Data Modeler

Responsibilities:

  • Understood the business requirements, converted them into functional and technical specifications, and developed mappings for the extraction, transformation, cleansing, and loading of the DW to deliver on time.
  • Did a POC for loading unstructured data (PDF, social media/Twitter, weblogs, HTML) into HDFS and Hive environments using the Informatica Developer client and Informatica Big Data Edition (BDE) tools.
  • Involved in designing the data model, deriving the conceptual, logical, and physical models using Erwin with the architect teams. Defined and documented the technical architecture of the data warehouse, including the physical components and their functionality.
  • Contributions included requirements gathering, design, development, implementation, and client coordination.
  • Responsible for Business Analysis and Requirements Collection.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specification into simple ETL coding and mapping standards.
  • Designed and customized data models for Data warehouse supporting data from multiple sources on real time.
  • Involved in building the ETL architecture, Change Data Capture (CDC), and source-to-target mappings to load data into the data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Configured Landing, Staging and Loading processes, Trust and Validation rules, Match and Merge process using Informatica MDM.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Used Hierarchy Manager for configuring entity base objects, entity types, relationship base objects, relationship types and profiles.
  • Experienced working with Services Integration Framework (SIF), EJB modules and Web services.
  • Worked on Real Time Integration between MDM Hub and External Applications using Power Center and SIF API for JMS.
  • Experienced in MDM Hub real-time development with detailed technical knowledge; followed MDM SIF coding standards and best practices.
  • Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing and data manipulation.
  • Involved in Data Modeling, System/Data Analysis, Design and Development for both OLTP and Data warehousing environments.
  • Worked on the Data modeling (Dimensional & Relational) concepts like Physical, Logical, Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Worked with large data volumes using Netezza features: distribution, organize, nz_migrate, Groom, and statistics.
  • Worked closely with user decision makers to develop the transformation logic used in Informatica PowerCenter.
  • Used transformations like Joiner, Expression, Connected and Unconnected Lookups, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Router, and Sequence Generator.
  • Used Workflow Manager for workflow and session management, database connection management, and scheduling of jobs to run in the batch process.
  • Developed a number of complex Informatica mappings, mapplets, and reusable transformations for different types of studies and for daily and monthly loading of data.
  • Used stored procedures to drop and create indexes before and after loading data into the targets (see the sketch after this list).
  • Removed bottlenecks at the source, transformation, and target levels for optimum use of sources, transformations, and target loads.
  • Wrote UNIX shell scripts and PMCMD commands for FTP of files from remote servers and for backup of the repository and folders.
  • Involved in code migration between repositories and folders.
  • Prepared a migration document to move the mappings from development to testing and then to production repositories.
  • Captured data error records, corrected them, and loaded them into the target system.
  • Implemented efficient and effective performance tuning procedures.
  • Tuned Source System and Target System based on performance details, when source and target were optimized, sessions were run again to determine the impact of changes.
  • Interfacing with and supporting QA/UAT groups to validate functionality.
  • Extensively used Eclipse tool to write the Java programs.
  • Created Single Table pairs and executed test cases using Informatica Data Validation tool.
  • Used Power Center sources, SQL views, join views and lookup views in Informatica Data Validation tool to test the scenarios.
  • Worked on generating various dashboards on Tableau Server using data sources such as Oracle, Netezza, and DB2, and created report schedules, data connections, projects, and groups.
  • Worked closely with business power users to create reports/dashboards using tableau desktop.
  • Extensively worked on creating, executing test procedures, test cases and test scripts using Manual/Automated methods.
  • Performed system testing and supported defect fixes raised during all phases of SIT (System Integration Testing).
  • Managed a team of developers to achieve project-specific goals.
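
A minimal sketch of the drop-indexes-before-load pattern in this list, as a PL/SQL procedure an Informatica pre-session task could call; the procedure and connection names are hypothetical, and a matching rebuild procedure would run post-session.

#!/bin/sh
# $DW_CONN is assumed to hold a user/password@tns connect string.
sqlplus -s "$DW_CONN" <<'EOF'
CREATE OR REPLACE PROCEDURE drop_target_indexes(p_table IN VARCHAR2) AS
BEGIN
  -- Drop every index on the target so the bulk load runs unindexed.
  FOR r IN (SELECT index_name FROM user_indexes
             WHERE table_name = UPPER(p_table)) LOOP
    EXECUTE IMMEDIATE 'DROP INDEX ' || r.index_name;
  END LOOP;
END;
/
EXIT;
EOF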

Environment: Informatica 9.6.1, Informatica BDE, Informatica MDM 9.5, Informatica IDD and IDQ, Informatica DVO, Informatica B2B Data Transformation Studio, SIF, Web Services, Netezza, Erwin 7, Oracle 11g, Hadoop, Hive, ER-Studio, PuTTY, WinSCP, FileZilla, flat files, Beyond Compare, Tableau and UNIX

Confidential, Branchville, NJ.

DW-BI Senior Application Developer.

Responsibilities:

  • Studied and understood the business and technical requirements of the existing Dragon, ODS, ODS Claims, ODS-Reporting, Stone Creek, and E&S systems.
  • Prepared impact analysis documents for each work request from the business.
  • Distributed the work requests across the team and assigned work to each team member.
  • Created the design and test plan documents for each work request.
  • Uploaded documents to SharePoint and a shared path for sign-off from application owners.
  • Changed the DDLs, DMLs, mappings, sessions, workflows, param files, and batch files according to the design documents using Informatica 9.0.1/9.5.1 in the development environment.
  • Executed Unit Testing and data validations for each work request.
  • Used UNIX shell scripting and PMCMD commands to run Informatica workflows.
  • Ran the Informatica jobs using the Zena scheduler.
  • Prepared the consolidated promote checklists for the QA in testing environment.
  • Migrated the mappings from Dev to QA using Informatica repository manager.
  • Compared the mappings from Dev to QA before the QA run and created the deployment groups.
  • Created the bat scripts and ran the jobs using the Zena third-party scheduling tool.
  • Conducted regression testing using Informatica (DVO) Data Validation Option tool.
  • Prepared the consolidated Production promote checklist for production environment.
  • Worked on production fixes and provided the solutions.
  • Prepared estimates for each release based on the work requests.
  • Trained on the MSBI SSIS and SSRS tools and worked on existing packages and report modifications for selected users.
  • Modified the stored procedures for music reports and tested the reports in the development and test environments.
  • Supported production support activities.

Environment: Informatica 9.0.1/9.5.1, SSIS, SSRS, SSAS, Zena, MS SQL Server, SQL Server Management Studio, Oracle, SQL Developer, PuTTY, WinSCP, Toad and UNIX

Confidential, Newyork, NY.

DW-BI Senior Application Developer/ Data Modeler.

Responsibilities:

  • Studied and understood the business and technical requirements of the RPM, RRDW, and CMP systems.
  • Created the design documents as per the new Requirements.
  • Developed mappings using Source Qualifier, Joiner, Lookup, Expression, Aggregator, Filter, Router, and Update Strategy transformations.
  • Implemented different mapping and mapplet logic according to the business requirements from the Analytical Certification stakeholder.
  • Created sessions and workflows using Informatica 8.6.1.
  • Involved in performance tuning at various levels, including target, source, mappings, and sessions for large data files.
  • Worked on Teradata utilities such as FastLoad, FastExport, and MultiLoad, and familiar with TPump (see the FastLoad sketch after this list).
  • Used BTEQ commands to export and import data.
  • Responsible for gathering the requirements from the Users and designed the Detailed design specifications for the Business Objects Universes and Reports.
  • Involved in analyzing business requirements and data specifications for Business Objects Universes and reports.
  • Created Cardinalities, Contexts, Joins, External Strategies and Aliases for resolving Loops and checked the Integrity of the Universes. Exported the Universes to the Repository to make resources available to the users.
  • Created Universes by retrieving data from various data sources and defining the necessary connections; generated reports on daily, weekly, and monthly bases.
  • Created complex conditions (Dates), Prompts and free hand custom SQL to create Derived table in universe. Defined logical classes and objects and hierarchies for analysis and drilling on hierarchies in the WEBI reports.
  • Created predefined conditions in the universe level to restrict the volume of data to be pulled in the report.
  • Worked extensively with @functions (@select, @prompt, @where) to overcome the problems with database when integrating with Business Objects.
  • Interacted with end users regularly to generate the required reports, using Business Objects functionality such as slice and dice, master/detail, user responses, and formulas.
  • Created reports for various Portfolios using the Universes as the main Data Providers.
  • Addressed chasm traps and fan traps using multi-pass SQL.
  • Scheduled reports through CMC and InfoView; developed complex WebI reports.
  • Involved in production support, moving objects from DEV to TEST to Production using the Import Wizard.
  • Used multiple data providers, union, intersection, minus, master/detail, cross tabs, and charts to create WebI reports.
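
A minimal Teradata FastLoad sketch for the load-utility work in this list; the logon, table, and input file are placeholders, the password comes from an environment variable, and FastLoad requires an empty target table.

#!/bin/sh
# Bulk-load a pipe-delimited file into an empty staging table.
fastload <<EOF
LOGON tdprod/etl_user,${TD_PASSWORD};
SET RECORD VARTEXT "|";
DEFINE acct_id (VARCHAR(18)),
       pos_qty (VARCHAR(18))
       FILE = /data/in/positions.txt;
BEGIN LOADING rpm_db.stg_positions
      ERRORFILES rpm_db.stg_positions_e1, rpm_db.stg_positions_e2;
INSERT INTO rpm_db.stg_positions (acct_id, pos_qty)
VALUES (:acct_id, :pos_qty);
END LOADING;
LOGOFF;
EOF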

Environment: Informatica 8.6.1, Business Objects XI R3.1.1, Erwin 7, Teradata, Teradata SQL Assistant, PuTTY, Tidal scheduler, Oracle 11g, Toad and UNIX.

Confidential, Austin, TX.

DW-BI Senior Application Developer/Data Modeler.

Responsibilities:

  • Studied and understood the business and technical requirements of the Confidential systems.
  • Created the design documents as per the new Requirements.
  • Developed mappings using Source Qualifier, Joiner, Lookup, Expression, Aggregator, Filter, Router, and Update Strategy transformations.
  • Implemented the CDC logic according to the business requirements from the Confidential stakeholder.
  • Created sessions and workflows using Informatica 8.6.1.
  • Involved in performance tuning at various levels, including target, source, mappings, and sessions for large data files.
  • Prepared the bat scripts and param files and ran them via the Windows scheduler to invoke Informatica workflows.
  • Executed unit testing and data validations for each job.
  • Used UNIX shell scripting and PMCMD commands to run Informatica workflows.
  • Ran the Informatica jobs using the Autosys scheduler.
  • Used the mapping debugger on data and error conditions to obtain troubleshooting information.
  • Migrated and compared the mappings and workflows from Development to UAT and Production using Repository manager.
  • Coordinated with the onshore team to resolve issues.

Environment: Informatica 8.6.1, Erwin 5.1, Oracle 10g, SQL, Teradata, Teradata SQL Assistant, PuTTY, WinSCP, Autosys, Toad and UNIX

Confidential, San Mateo, CA.

DW-BI Application Developer.

Responsibilities:

  • Involved in analyzing business requirements and data mapping specifications for the iASAP Tax Basis ETL mappings and Business Objects reports.
  • Coordinated with and provided proper inputs to the FTT Informatica admin for Informatica environment setup, the BO admin for Business Objects setup, and the UNIX admin for directories and paths for the iASAP Tax Basis project.
  • Coordinated with the onshore teams (GDG, IMG, and Ad teams) on data mapping gap analysis and on logical and physical data model gaps.
  • Created the SDS (System Delivery Specification-ETL) and TSD (Technical System Design Document-ETL) design documents for the iASAP Tax Basis project.
  • Uploaded the design documents to SharePoint (FTT) for onsite team approval.
  • Developed mappings using Source Qualifier, Joiner, Lookup, Expression, Aggregator, Filter, Router, and Update Strategy transformations.
  • Created sessions and workflows using Informatica 8.1.1.
  • Executed Unit Testing.
  • Implemented the shell scripts, parameter files, and Control-M scheduling to invoke Informatica workflows.
  • Prepared the test cases, test data, and test results for iASAP tax basis project.
  • Migrated and compared the Mappings and workflows from Development to Test environment using Repository manager.
  • Created the ETL process flow document for UAT.
  • Created the Universe for the iASAP Tax Basis reports (10/31 excise date, FYE) using Business Objects XI R2.
  • Created the reports (open 1256 futures, options, and forwards; non-open 1256 futures, options, and forwards) using WebI.
  • Scheduled the reports using the Scheduler.

Environment: Informatica 8.1.1, Erwin 5.1, Business Objects XI R2, SQL Server, Greenplum, Oracle, SQL, PL/SQL, PuTTY, WinSCP, Control-M and UNIX.

Confidential, Plano, TX

DW-BI Application Developer.

Responsibilities:

  • Studied and understood the business and technical requirements of the existing GECARS, Non-GECARS, ERAM, GD, CRMS, EDW, and CDW systems.
  • Coordinated with the GAMS team on the existing Informatica environment setup, UNIX directories and paths, and the Oracle database setup for the CDW project.
  • Prepared impact analysis documents for each work request from the business.
  • Prepared high-level mapping design documents from the requirements.
  • Created the design and test plan documents for each work request.
  • Uploaded documents to the Confidential SharePoint and a shared path for sign-off from application owners.
  • Changed the DDLs, DMLs, mappings, sessions, workflows, param files, and batch files according to the design documents using Informatica 8.6.1 in the development environment.
  • Developed mappings for audit reconciliation of daily loads.
  • Implemented change data capture using control tables (see the sketch after this list).
  • Parameterized mappings using mapping parameters, session parameters, and workflow variables defined in parameter files.
  • Analyzed session logs, session run properties, and workflow run properties.
  • Analyzed rejected rows using bad files.
  • Used UNIX shell scripting and PMCMD commands to run Informatica workflows.
  • Performed deployment of Informatica code.
  • Ran the Informatica jobs using the DAC scheduler.
  • Involved in performance tuning at various levels, including target, source, mappings, and sessions for large data.
  • Applied performance techniques at the Informatica level and fine-tuned the SQL queries at the database end to complete the jobs within the SLA window.
  • Executed Unit Testing and data validations for each Job.
  • Migrated the mapping from Dev to QA and QA to UAT environments.
  • Involved in analyzing business requirements and data specifications for Business Objects Universes and Reports.
  • Created reports using Desktop Intelligence and thin-client reports using WebI.
  • Scheduled reports using the Scheduler.
  • Linked Universes based on the requirements.
  • Published and scheduled the documents.
  • Created reports using Business Objects functionality such as queries, drill down, cross tab, and master-detail.
  • Used drill up/down operations for different aspects of the client business.
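
A minimal sketch of the control-table CDC in this list: read the last-load watermark, extract only newer rows, then advance the watermark after a clean load. The control table, job name, and connection are hypothetical.

#!/bin/sh
# $DW_CONN is assumed to hold a user/password@tns connect string.
LAST_TS=$(sqlplus -s "$DW_CONN" <<EOF
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT TO_CHAR(last_load_ts, 'YYYY-MM-DD HH24:MI:SS')
FROM   etl_control WHERE job_name = 'CUST_DAILY';
EXIT;
EOF
)
echo "Extracting rows changed since $LAST_TS"
# The extract (an Informatica source filter, in this project) would use
#   WHERE updated_ts > TO_DATE('$LAST_TS', 'YYYY-MM-DD HH24:MI:SS')
# and only after a successful load does the wrapper advance the watermark:
sqlplus -s "$DW_CONN" <<EOF
UPDATE etl_control SET last_load_ts = SYSDATE WHERE job_name = 'CUST_DAILY';
COMMIT;
EXIT;
EOF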

Environment: Informatica 7.1, Business Objects XI R2, Oracle 9i, DB2, SQL, PL/SQL, PuTTY, UNIX shell scripting, DAC, WinSCP, Toad and UNIX.
