
Sr. Informatica Consultant Resume

SUMMARY

  • 12+ years of professional IT experience in Informatica Cloud (IICS), Informatica PowerCenter 10.x, SQL, PL/SQL, Control-M scheduler, data modeling and Business Objects, with data warehousing experience in the finance and healthcare domains.
  • Experience in all phases of the data warehouse life cycle, including requirement analysis, design, coding, testing and deployment, and led projects with a team size of four.
  • Involved in dimensional data modeling, creating fact tables and dimension tables.
  • Experienced in OLTP/OLAP system study, analysis and ER modeling, developing database schemas such as Star and Snowflake schemas used in relational and multidimensional modeling.
  • Extensive experience in extraction, transformation and loading of data directly from various data sources such as flat files, XML, SQL Server and Oracle.
  • Experienced in capturing incremental data (CDC) from source systems.
  • Experience in Informatica Data Quality (IDQ 8.x) on the Analyst and Developer tools.
  • Extracted data from EDI source systems and files to the data warehouse on a weekly basis; extensive experience in Informatica Cloud (ICS, ICRT), data synchronization services, data replication tasks, integration templates, Salesforce CRM, XML, flat files, CSV and Redshift (AWS).
  • A good understanding of and experience with Business Objects reports.
  • Good knowledge of Partition By Range, Partition By Hash, Sub-Partitions and List Partitions.
  • Extensive experience in writing stored procedures (PL/SQL), triggers, functions and packages.
  • Strong experience in using Informatica PowerCenter to integrate data with different sources/targets.
  • Executed and scheduled taskflows using the Informatica Cloud tool to load data from source to target.
  • Used Source Qualifier, Expression and Filter transformations in Informatica Cloud for business logic.
  • Experience with BULK INSERT, BCP utilities and scripts for data transformation and loading.
  • Experience designing strategies for maintaining audit tables, load balancing, exception handling and high data volumes.
  • Experience in loading and extracting data using Teradata utilities such as FastLoad, MultiLoad, TPump, FastExport and Teradata Parallel Transporter as services in Informatica.
  • Having Sound knowledge on Teradata Architecture and Data Distribution techniques.
  • Good exposure to Development, Testing, Debugging, Implementation, Documentation, End-user training and Production support.
  • Extensive exposure to SOA and BPM; validated XMLs against XSDs and understanding of WSDL.
  • Experience with software development life cycle (SDLC) and project management methodologies.
  • An excellent team member with the ability to perform individually, good interpersonal relations, strong communication skills, a strong work ethic and a high level of motivation; a quick learner with an aptitude for taking on responsibility.

TECHNICAL SKILLS

Languages: Unix Shell Scripting (ksh, bash), PL/SQL, Windows PowerShell, Batch Scripts

Databases: Oracle 12c/11g/10g/9i/8i/7.x, Teradata, SQL Server Integration Services (SSIS), DB2 UDB 7.2, MySQL 5.0/4.1, MS SQL Server Management Studio

ETL/Reporting: Informatica PowerCenter 10.x, Informatica Cloud ICS/IICS, ODI 12c, Talend, BO, Tableau

Data Modeling: ERwin 4.x/3.x, Star & Snowflake Schema, Physical and Logical Modeling, Dimensional Data Modeling

Cloud Practice: AWS EC2, EFS, Azure VM, RDS Oracle/SSMS, AWS CLI

PROFESSIONAL EXPERIENCE

Confidential

Sr. Informatica Consultant

Responsibilities:

  • Implemented designs and established development standards for refactoring changes.
  • Migrated and rebuilt data mappings across various environments.
  • Tuned the Informatica code for better performance.
  • Extracted data from Oracle sources and loaded it into the ODS and data marts.
  • Worked with the team Manager and Business Analysts of the team to understand prioritization of assigned trouble tickets and to understand business user needs/requirements driving the support requests.
  • Performed problem assessment, resolution and documentation, as well as performance and quality validation, on new and existing BI environments.
  • Reviewed ETL development and drove implementation quality - ensured unit testing was completed and quality audits were performed on the ETL work.
  • Designed the ETL specification documents to gather workflow information from offshore and shared them with the integration and production maintenance teams.
  • Responsible for implementing action items identified during code review sessions focused towards improving data quality and performance.
  • Involved in performance tuning of sessions that work with large sets of data by tweaking block size, data cache size, sequence buffer length and target based commit intervals.
  • Developed sessions and batches to move data at specific intervals and on demand using workflow manager.
  • Migrated data from the legacy Oracle system to the target Postgres data model.
  • Worked on Data Validation Option (DVO) to validate data between different environments.
  • Installed Informatica PowerCenter 10.2 on an EC2 server for multi-node operation in the dev environment.
  • Configured a grid in the development environment for better performance and high availability.

Environment: Informatica PowerCenter 10.2, DBViewer, SQL Developer, UNIX, Oracle 12c, Cognos 10.2.1, Postgres

Confidential

ETL Lead

Responsibilities:

  • Facilitate understanding and documenting requirements, use cases, technical solution and testing scenarios, and translating the same to offshore Development team
  • Responsible for facilitating development team in understanding the technical approach to execute the requirements, guiding and mentoring the team members.
  • Identifying risks and issues and taking the necessary steps to address them upfront
  • Owning the preparation of estimates down to the level of data elements and code, coordinating with the client to explain and justify the estimates while complying with the required deal review process and meeting the required standards
  • Seek clarifications from the client or business, as required from time to time in order to ensure smooth and timely delivery of assignment in hand
  • Working with MS Access databases to analyze source data elements and attributes.
  • Responsible for understanding the key systems and environment to get a strong grip on the purpose, objectives and process flows of each system and how it will play its role in the project
  • Helping in preparing a project plan, monitoring the progress on daily basis, reporting the progress to the delivery management and the client on regular basis
  • Ensuring that the deliverables are properly tested by reviewing the code, functionalities, documentation before delivery is completed
  • Worked with Informatica PowerExchange and Informatica Cloud to integrate Salesforce and load data from Salesforce to the Oracle database.
  • Installed the Informatica Cloud Secure Agent on Windows and Linux.
  • Developed Informatica Cloud mappings to load the data through taskflows.
  • Worked with Informatica Cloud to create source/target connections, monitor and synchronize the data through DSS, and created replicas with DRS.
  • Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.

Environment: Informatica PowerCenter 9.x, Toad 9.1, SQL Developer, Informatica Cloud, Teradata 13, UNIX, Oracle 11g, Redshift, Windows.

Confidential

Sr. Informatica Developer/Admin

Responsibilities:

  • Created the ETL technical specification for the effort based on business requirements. The source system being mainly Oracle Master Data.
  • Worked closely with Business Analyst and Shared Services Technical Leads in defining technical specifications and designs for Oracle based large data warehouse environment.
  • Developed detailed ETL implementation design based on technical specification for BI effort within the ETL design standards and guidelines.
  • Served as S&P’s ETL expert - performing technical development, maintenance, tuning, and support activities.
  • Created unit testing scripts and representative unit testing data based on business requirements.
  • Ensured testing data is available for unit and acceptance testing within development and QA environments.
  • Unit tested ETL code/scripts to ensure correctness of functionality and compliance with business requirements.
  • Refined ETL code/scripts as needed to enable migration of new code from development to QA and QA to production environments following the standard migration signoff procedure.
  • Coordinated any required Unix related scripts in support of the ETL deployment working with the ETL Admin.
  • Worked with the team Manager and Business Analysts of the team to understand prioritization of assigned trouble tickets and to understand business user needs/requirements driving the support requests.
  • The loading process was done using the ETL tool, BTEQ scripts and Teradata utilities such as FastLoad, MultiLoad and TPump.
  • Leveraged the FastExport utility to extract data from tables.
  • Performance tuning of Queries based on the Explain Plan from Teradata optimizer.
  • Performed problem assessment, resolution and documentation, as well as performance and quality validation, on new and existing BI environments.
  • Reviewed ETL development and drove implementation quality - ensured unit testing was completed and quality audits were performed on the ETL work.
  • Designed the ETL specification documents to gather workflow information from offshore and shared them with the integration and production maintenance teams.
  • Worked on Informatica PowerCenter 9.1 tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer and Transformation Designer. Developed Informatica mappings and tuned mappings for better performance.
  • Participated in deployment planning and in deployment of the system to production
  • Facilitated business user smoke testing of the production system by setting up test data.
  • Involved in production support duties including monitoring of nightly batches.
  • Responsible for updating business stakeholders and OLTP/OLAP application support teams about the status of various ETL sessions and the impact of failed sessions on data availability.
  • Responsible for debugging failed ETL sessions and taking steps to mitigate impact to business users.
  • Extensively used the Informatica Data Quality tool (IDQ Developer) to create rule-based data validations for profiling.
  • Created dictionary tables using the IDQ Analyst tool for data validations.
  • Implemented Informatica B2B parsers to convert PDF documents to XML through Informatica.
  • Involved in code review sessions to bring the team’s attention to best practices and to identify areas of improvement.
  • Responsible for implementing action items identified during code review sessions focused towards improving data quality and performance.
  • Involved in performance tuning of sessions that work with large sets of data by tweaking block size, data cache size, sequence buffer length and target based commit intervals.
  • Developed sessions and batches to move data at specific intervals and on demand using workflow manager.
  • Worked on different modules of the project for successful execution
  • Created/developed different types of profiles, such as column-level profiles, summary profiles, drill-down profiles, scorecards and reports, using IDE.
  • Created match and merge rules, developed address validations for countries such as the US, and developed reusable error-handling rules using IDQ.
  • Expertise and working experience in establishing the process, methodology standards at all phases of an Enterprise Data warehouse.
  • Extensive experience in the areas of Systems Analysis, Fit-Gap Analysis, Business Process Reengineering, Product Configuration & Implementation, Configuration Management, Development & Deployment ETL tools & Reports, conducting testing and validations, conducting user training, documentation and Production Support.
  • Very strong analytical, technical and business skills in architecting and designing using ETL applications.
  • Technical leadership, mentoring and managing senior designers, developers and development teams.

Environment: Informatica PowerCenter 8.6/7.x, IDQ Developer, Oracle 11g, PL/SQL, Toad, DB, UNIX, Informatica B2B, Windows, SQL.

Confidential

Programmer/Sr. Informatica ETL Developer

Responsibilities:

  • Understood the Business point of view to implement coding using Informatica Power Center 9.1/8.6.1.
  • Extensively used Informatica Power Center 9.1/8.6.1 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
  • Involved in design and development of complex ETL coding in an optimized manner.
  • Redesigned some of the existing mappings in the system to meet new functionality.
  • Optimized performance by tuning the Informatica ETL code as well as SQL.
  • Based on the requirements, used various transformation like Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, XML, Lookup, Aggregator, Joiner and Stored Procedure transformations in the mapping.
  • Developed Informatica SCD Type-I and Type-II mappings.
  • Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
  • Implemented Error Rejection process to load bad/invalid records into a separate reject table.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
  • Involved in various testing activities like database testing, unit testing, system testing, performance testing and was also responsible for maintaining of testing metrics, defect tracking.
  • Identified and debugged the errors before deploying and worked on migration of the maps and workflows from development to UAT and from UAT to Production.
  • Designed the Technical specification document and the implementation method documents and shared with Integration team and other ETL project teams.
  • Designed the ETL runs performance tracking sheet in different phases of the project and shared with Production team.
  • Prepared the validation report queries and executed after every ETL runs and shared the resultant values with Business users in different phases of the project.
  • Configured the ETL workflows with control tables of warehouse to extract the data on requirement basis.
  • Written shell scripts at workflow command level to Archive the spreadsheet provided by Business users.
  • Coordinated with the operations team for scheduling the workflows and processed the flat files date-wise.
  • Tested the mappings generated by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
  • Supporting upcoming enhancements to global data warehouse for corporate wide data requests including ETL modifications and next generation reporting capabilities.
  • Extensively used Informatica to extract from different source systems and load into the target Oracle Base tables.
  • Extracted data from EDI sources to the data warehouse through Informatica ETLs on a weekly basis.
  • Exclusively worked with production team in resolving processing difficulties with control tables and shared objects.
  • Worked on database changes, created required indexes and applied post-session commands in ETL.
  • Implemented the session level parallel processing performance method to extract the data from large sources.

Environment: Informatica PowerCenter 9.1.1/8.6.1, Oracle 11g, Flat Files, TOAD, Windows XP, UNIX, ODS, PL/SQL, SQL, Business Objects, Erwin, Omnidex, Maestro
