
Informatica Tech Lead Resume


Houston, TX

SUMMARY

  • Highly skilled professional with over 15 years of IT experience spanning Business Requirements Analysis, Application Design, Data Modeling, Development, Implementation, and Testing, across industries including Pharmaceutical, Gaming, Health Care, Energy, Education, Insurance, Financial Services, and Supply Chain.
  • Expert knowledge of OLAP concepts, data warehousing concepts, and dimensional modeling (Star and Snowflake schemas).
  • Broad IT experience across all phases of the Software Development Life Cycle (SDLC), including user interaction, business analysis/modeling, design, development, integration, planning, testing, and documentation for data warehouse applications, ETL processing, and distributed applications.
  • Experience in Erwin 4.x Data Modeling tool.
  • Extensive database experience using Oracle 11g/9i/8i/7.x, MS SQL Server, MS Access, DB2, AWS Redshift, and PostgreSQL.
  • Extensive experience extracting data from multiple sources such as SAP R/3, SAP BW, SFDC, Siebel, JD Edwards, Oracle, AWS Redshift, DB2, SQL Server, AS/400, flat files, and PL/SQL sources; transforming data using various transformations; and loading it into targets such as Oracle, SAP, flat files, and Netezza.
  • Experience in development and code reviews, execution of unit test plans, and migration of mappings across environments.
  • Developed ETL with SCDs, caches, and complex joins using optimized SQL queries.
  • Strong knowledge of SDLC phases including requirements analysis, design, development, testing, and implementation.
  • Experience with WinSQL and TOAD to test, modify and analyze data, create indexes, and compare data from different schemas.
  • Experience configuring workflows and sessions in DAC, Control-M, and Redwood Cronacle, and scheduling them.
  • Created workflow instances and improved performance on large data volumes using round-robin and key-range partitioning; experienced with advanced techniques such as pushdown optimization.
  • Experience in extracting the data from Sales Force (SFDC) to Netezza using Informatica Cloud and with other source systems like Flat file, SQL Server, DB2.
  • Extensively worked on different Informatica Cloud task types to fulfill business requirements, including Data Synchronization, Data Replication, Mapping, and PowerCenter tasks.
  • Extensively worked with Informatica Data Quality (IDQ) 10.1 for data cleansing, data matching, and data conversion, removing unwanted data and ensuring data correctness.
  • Widely used IDQ transformations such as Address Validator, Parser, Standardizer, Match, Merge, Filter, Comparison, Consolidation, Case Converter, and Data Masking.
  • Performed data profiling using the Informatica Data Quality Analyst and Developer tools.
  • Experience in UNIX shell scripting (Korn/C) and PL/SQL procedures.
  • Participated in migrating code from Development to Test and Production, and provided migration documentation such as environment preparation, deployment components, batch execution instructions, and DFDs for jobs.
  • Excellent analytical, problem solving, communication, and interpersonal skills. Ability to interact with individuals at all levels.

TECHNICAL SKILLS

DW/ETL Tools: Informatica PC 10.x/9.x/8.x/7.x, Informatica DQ 10.1, SSIS

Modeling Tools: Physical, Logical, and Dimensional Modeling; ERWIN 4.5/4.0

Databases: DB2, Oracle, SQL Server, Netezza, MS Access, AWS Redshift, PostgreSQL

Query Tools: WinSQL, TOAD, SQL Navigator, SQL Developer

Database Load Tools: SQL Loader (Oracle)

Languages: SQL, PL/SQL, UNIX Shell Scripting

Operating System: Windows, UNIX, Linux

Cloud Technologies: 1010Data, Informatica cloud, IICS

Scheduling tools: Redwood Cronacle, Control-M

Incident Management: ServiceNow, BMC Remedy Change Management

PROFESSIONAL EXPERIENCE

Informatica Tech Lead

Confidential - Houston, TX

Responsibilities:

  • Gathered information and requirements for migrating data from one system to another.
  • Scheduled and ran meetings with business teams to define project scope for the requested changes.
  • Requested high-level estimates from impacted teams and answered any queries raised by those teams.
  • Assessed the client's business reporting methods and utilized them in building data warehouses.
  • Applied judgment in analyzing data requirements for data warehouse components.
  • Proficient in IDQ development, including address validation services, data profiling, cleansing, parsing, data standardization, verification, matching, and data quality checks on incoming source data.
  • Performed exception monitoring and handling.
  • Worked in IDQ on reference tables, profiling, and Address Doctor.
  • Developed data quality frameworks for cloud and on-premise environments using the Analyst and Developer tools.
  • Applied the rules and profiled the source and target table's data.
  • Involved in extracting data from SAP systems using Informatica Power Center.
  • Responsible for designing and developing Informatica mappings per business requirements.
  • Conduct Unit, Integration and UAT Testing to meet the Customer needs.
  • Troubleshot production issues within the integration platform using Informatica, DB2 utilities, UNIX scripts, and SQL.
  • Responded to critical production outages and drove root cause analysis.
  • Wrote complex SQL queries and performed performance tuning.
  • Created UNIX scripts for file transfers and file manipulation.
  • Used WinSQL to query target tables for data validation and data conditioning.
  • Used Redwood Cronacle to monitor and schedule Informatica jobs and scripts per business requirements.
  • Worked with the offshore team to handle daily incidents and resolve them without impacting business processes.
  • Wrote complex queries for creation and modification of data.
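
A minimal sketch of the file-transfer and file-manipulation pattern described above: pick up inbound data files, stamp them with a load timestamp, and move them to an archive folder. All paths and file names here are illustrative placeholders, not from the actual environment.

```shell
#!/bin/sh
# Illustrative landing and archive folders (placeholders, not real paths)
SRC=/tmp/demo_inbound
DST=/tmp/demo_archive
rm -rf "$SRC" "$DST"
mkdir -p "$SRC" "$DST"

# Simulate an inbound feed file
printf 'id|name\n1|test\n' > "$SRC/customers.dat"

# Stamp each .dat file with a timestamp and move it to the archive
ts=$(date +%Y%m%d%H%M%S)
for f in "$SRC"/*.dat; do
  mv "$f" "$DST/$(basename "$f" .dat)_${ts}.dat"
done
```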

Environment: Informatica 10.2/9.6.4, DB2, AS/400, Informatica Developer (IDQ), Informatica Analyst, UNIX scripting, Redwood Cronacle for scheduling, WinSQL, WinSCP, PuTTY, ServiceNow for incident management.

Principal Architect

Confidential- Houston, TX

Responsibilities:

  • Extracted data from Redshift to staging tables (AWS Redshift) and to SAP BW (BPC) tables.
  • Extracted data from SAP R/3 and SAP BW tables into staging tables in AWS Redshift.
  • Developed Informatica mappings to load data from S3 buckets into Redshift staging tables.
  • Developed Informatica mappings logically equivalent to the existing BODS (Data Services) jobs.
  • Created UNIX scripts to run extraction queries against the Redshift database, land the results as files in S3 buckets, and copy the files from there to the local Informatica folder for data extraction.
  • Created UNIX scripts to update the status into Job Notification tables.
  • Involved in preparing UTP documents and Delivery sheets for the developed build.
  • Prepared a hands-on document for all the objects related to the project.
  • Performed unit testing and tuned the mappings for better performance.
  • Responsible for peer reviews of objects developed by others.
  • Deployed the code across all environments using Deployment group.
  • Created new jobs and updated existing jobs in Redwood Cronacle.
  • Monitored the jobs and rectified issues occurring during production support (Redwood Cronacle).

Environment: Windows 2000/Professional, BODS 4.2 (SAP Data Services), Informatica 9.6.1, Netezza, UNIX, AWS Redshift, SAP R/3, SAP BW, Redwood Cronacle

Principal Architect

Confidential- Houston, TX

Responsibilities:

  • Involved in the design, development, testing, and implementation of ETL jobs (Informatica, SAP BODS, SSIS) to extract, transform, and load data into the target system.
  • Worked with client teams on technical design-related developments.
  • Used the database server to stage and transform client data according to business rules and load it into the target system.
  • Installed the ETL application and configured the repositories using DB2/Oracle databases.
  • Scheduled jobs through the Redwood application and performed error-handling tasks.
  • Performed unit and integration testing for all jobs, and performed database checks during all schedules and loads.
  • Provided production support by monitoring loads and resolving data issues.

Environment: Informatica 9.6.1, SAP BODS 4.2, SSIS 2012, SAP BW 7.4, Redwood

Principal Architect

Confidential- Houston, TX

Responsibilities:

  • Extracted data from SFDC (Salesforce) to Netezza staging area using Informatica cloud.
  • Extracted data from various source systems to Netezza utilizing Synchronization, Replication, Mapping tasks in Informatica Cloud.
  • Created Users, Groups and assigned roles using Informatica Cloud Administration.
  • Developed Informatica mappings to load data from Netezza staging to EDW tables (Data Mart).
  • Implemented SCD Type 1 to insert new records and update existing records.
  • Created pre-session and post-session scripts to change the proc sts cd from N to P and from P to Y.
  • Created a referential integrity check script to identify any new OPCO numbers in the Task, Event, and Opportunity tables and insert them into the OPCO table when found.
  • Worked on developing ETL pipelines over S3 files into the data warehouse using AWS Glue.
  • Involved in preparing Test cases and Delivery sheets for the developed build.
  • Prepared a hands-on document for all the objects related to the project.
  • Performed unit testing and tuned the mappings for better performance.
  • Responsible for peer reviews of objects developed by others.
  • Created new applications and updated existing applications in Control-M.
  • Monitored the jobs and rectified issues occurring during production support (Control-M).
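
The SCD Type 1 logic noted above (overwrite on key match, insert on no match, no history kept) can be sketched file-to-file with awk; the pipe-delimited layout and sample keys below are hypothetical stand-ins for the actual staging and dimension tables.

```shell
#!/bin/sh
# Current dimension rows (key|name) and an incoming staging batch
printf '1|Acme\n3|Initech\n'     > /tmp/demo_dim.psv
printf '1|Acme Corp\n2|Globex\n' > /tmp/demo_stg.psv

# SCD Type 1: staging wins on a key match (update in place),
# unmatched keys are inserted; prior values are not preserved.
awk -F'|' 'NR==FNR {row[$1]=$0; next}  # load current dimension
                   {row[$1]=$0}        # staging overwrites or adds
           END     {for (k in row) print row[k]}' \
    /tmp/demo_dim.psv /tmp/demo_stg.psv \
  | sort -t'|' -k1,1n > /tmp/demo_dim_new.psv
```

The result keeps one current row per key: the changed key carries the new value, the new key is appended, and untouched keys pass through.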

Environment: Windows 2000/Professional, Informatica 9.0.1, Netezza, UNIX, SFDC, Control-M, Informatica Cloud, AWS Glue.

Principal Architect

Confidential- Houston, TX

Responsibilities:

  • Responsible for interacting effectively with the onsite team to gather requirements.
  • Built Informatica one-to-one mappings from source systems to Netezza staging tables.
  • Developed Informatica cloud mappings to load data from Netezza staging to EDW tables (Data Mart).
  • Implemented SCD Type 1 to insert new records and update existing records.
  • Involved in preparing Test cases and Delivery sheets for the developed objects.
  • Performed data validation on sales and purchase order data, comparing the Data Mart tables against the data pushed to 1010data.
  • Created UNIX shell scripts for loading data from the Netezza database to 1010data through Tenup scripts.
  • Created queries in UNIX for loading data as a single file, or as multiple files in chunks, for both small tables and huge tables with 2-4 billion rows of data.
  • Extensively worked on TLRFs and DTS.
  • Pushed/Loaded data from multiple source systems (STS, SUS, SAP) to Netezza and from Netezza to 1010data with data validation.
  • Held daily calls with the onsite team and performed client validation for effective completion of tasks.
  • Developed Tenup and Tendo scripts for manual and automated validations.
  • Monitored the jobs scheduled in the Control-M scheduling tool.
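
The single-file-versus-chunked loading approach mentioned above can be sketched with split: one large extract is broken into fixed-size pieces that can be loaded in parallel. The row counts and paths are small illustrative placeholders, not the actual multi-billion-row volumes.

```shell
#!/bin/sh
rm -rf /tmp/demo_chunks
mkdir -p /tmp/demo_chunks

# Stand-in for a large table unload (100 rows instead of billions)
seq 1 100 > /tmp/demo_extract.dat

# Break the extract into 25-row chunk files: part_aa, part_ab, ...
split -l 25 /tmp/demo_extract.dat /tmp/demo_chunks/part_
```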

Environment: Informatica 9.0.1, Netezza, UNIX, 1010data, Control-M, SFDC, Informatica Cloud.

Informatica Tech Lead

Confidential - Lisle, IL

Responsibilities:

  • Responsible for interacting effectively with the client to gather SQL Server reporting requirements.
  • Created reports using SSRS for internal use by various departments such as Sales, Finance, Marketing, and Client Operations.
  • Extensively used SSRS to create various types of simple and complex reports.
  • Created Query Calculations and Layout Calculations.
  • Created various types of reports like Parameterized reports, Linked reports, Snapshot reports, Cached reports, Ad hoc reports, Drilldown reports, Drillthrough reports and Subreports.
  • Created new reports to improve performance and the look and feel of reporting.
  • Developed dashboards and summary and detail reports for internal customers and partners using SSRS.
  • Tested the reports for data accuracy, formatting, and performance.

Environment: SSIS, SSRS, MS SQL SERVER 2005, Control-M.

Informatica Tech Lead

Confidential - Lisle, IL

Responsibilities:

  • Visited the client to understand the existing processes, documents, and use cases, gather the necessary requirements, and provide knowledge transfer to the offshore resources developing the tasks.
  • Created detail design documents and mapping specifications based on the design review sessions; mapping specifications were passed to offshore for development.
  • Designed the ETL process for extracting data from heterogeneous source systems, transforming it, and loading it into the ODS.
  • Responsible for code reviews of the Informatica process and for plugging maps into the common flow for integration testing.
  • Developed common utility Informatica maps, B2B canonical maps, and ODS real-time load maps.
  • Point of contact for all migration of Informatica code from QA to PROD environments.
  • Documented the Processes like Software Test Plan, Software Test Specification, Unit Test and Deployment Request Documents.
  • Designed and developed Informatica web services interface to allow ESB to communicate with Informatica Batch workflows
  • Handled heterogeneous source systems such as SQL Server, XML files, flat files, and comma-separated value files.
  • Confidential planned to go live with many applications and successfully went live with the Business-to-Business (B2B) process in September 2011; our team was responsible for making that possible.
  • Managed and delivered many projects/tasks and met deadlines on time by actively engaging the resources at hand.
  • Going forward, our offshore team will be responsible for supporting the production environment 24x7.

Environment: Informatica 8.6.1, Web services, SQL Server, XSD, QNXT, AS400, Control M, Serena version control.

Informatica Tech Lead

Confidential

Responsibilities:

  • Made frequent visits to the client site to understand the existing processes, documents, and use cases, gather the necessary requirements, and provide knowledge transfer to the offshore resources developing the tasks.
  • Responsible for delivering tasks assigned by the client on time, and served as the single point of contact for onsite and offshore resources.
  • HMSA planned to go live with MSRP and UCD extracts and to comply with Blue Cross Blue Shield Association (BCBSA) standards; my team was responsible for making that possible.
  • Worked on the mapping design specs and implementation process with the client.
  • Was involved in the deployment process from QA to PROD environments.
  • Documented the Processes like Unit Test Plan, Integration Test Specification and Deployment Request Documents.
  • Handled heterogeneous source systems such as SQL Server, XML files, flat files, and comma-separated value files.
  • Completed the development phase; all processes were migrated from Informatica Integration Services to SSIS and SSRS per a recent management decision.

Environment: Informatica PowerCenter 9.1, SQL Server 2005, Microsoft SQL Server Management Studio Express 9 and Windows 2000.
