Informatica Developer Resume
SUMMARY
- Organized and goal-oriented professional with over 8 years' experience in IT Design, Development, Testing and Production Support as a Team Lead with strong leadership qualities.
- Extensive experience in System Analysis, Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in Banking and Insurance domains.
- Practical understanding of data modeling (Dimensional & Relational) concepts like Star Schema Modeling, Snowflake Schema Modeling, and Fact and Dimension tables.
- Proficient in data integration of various data sources, including relational databases such as Oracle 11g/10g/9i, Netezza, MS SQL Server, DB2 and Teradata, as well as AWS sources, VSAM files and Flat Files, into the staging area, ODS, Data Warehouse and Data Mart.
- Strong experience in Extract, Transform and Load (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager).
- Good knowledge of Informatica PowerExchange and IDQ for building data integration solutions.
- Experience in developing Informatica PowerCenter mappings, sessions and workflows.
- Experience in Administration activities like Creating and Managing Repositories, Users, User Groups, Folders and deployment groups.
- Proficient in using Informatica PowerCenter to build workflow and ETL solutions for Data Warehousing applications.
- Good understanding of the data warehouse implementation life cycle.
- Good knowledge of and experience with the Ab Initio ETL tool, including the GDE Designer, Co>Operating System and components.
- Thorough knowledge of EME, check-ins, check-outs, the command line interface, air commands and dependency analysis.
- Experience with Ab Initio EME/Sandbox to implement version control and impact analysis across various projects in the organization.
- Experienced with different Relational databases like Teradata and DB2.
- Worked with tools in the Hadoop Ecosystem including HDFS, MapReduce and Sqoop, with knowledge of HDFS components (Name Node, Data Node, Job Tracker, Task Tracker) and the MapReduce programming paradigm.
- Experience in migrating data using Sqoop from HDFS to relational database systems and vice versa.
- Designed and developed Teradata BTEQ, MultiLoad and FastLoad scripts to load data from load-ready files into the Teradata staging area, Data Warehouse or data marts for specific reporting requirements or downstream applications; also created data export scripts using the FastExport, TPump and TPT utilities.
- Experienced in PL/SQL programming (stored procedures, triggers, packages) using Oracle SQL/PL/SQL and SQL Server, and in UNIX shell scripting to perform job scheduling.
- Scheduled and automated Teradata SQL scripts in UNIX using Korn shell scripting via PuTTY, and transferred files between Windows, UNIX and Mainframe servers using UNIX FTP commands (a minimal sketch of such a script follows this summary).
- Worked with automation tools such as Autosys to schedule Teradata Data Mover jobs and applied any scheduling changes.
- Created and automated daily, weekly and monthly profitability analysis reports for credit card statements using SAS procedures such as PROC FREQ, PROC MEANS, PROC SORT, PROC PRINT and PROC REPORT.
- Good experience in Production Support: identifying root causes, troubleshooting, handling scheduling changes and fixing issues to meet SLAs on time.
- Good communication and interpersonal skills; proficient in prioritizing and multitasking to ensure all tasks are completed on time.
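
A minimal sketch of the kind of Korn shell load-and-transfer wrapper described above; every host, credential, path and table name is a hypothetical placeholder, not a detail from an actual engagement:

```sh
#!/bin/ksh
# Sketch: pull a load-ready file from a Windows file server via FTP and load
# it into a Teradata staging table with BTEQ. All names are placeholders.

SRC_FILE=/data/landing/acct_daily.txt
LOG_FILE=/data/logs/acct_daily_$(date +%Y%m%d).log

# Fetch the load-ready file from the (hypothetical) Windows server.
ftp -n winfs01 <<'FTP_EOF'
user etl_user etl_pass
cd /exports
lcd /data/landing
get acct_daily.txt
bye
FTP_EOF

# Load the file into a staging table; .REPEAT * re-runs the INSERT per row.
bteq <<BTEQ_EOF > "$LOG_FILE" 2>&1
.LOGON tdprod/etl_user,etl_pass
DELETE FROM stg.acct_daily;
.IMPORT VARTEXT '|' FILE = $SRC_FILE
.QUIET ON
.REPEAT *
USING (acct_id VARCHAR(18), bal VARCHAR(20))
INSERT INTO stg.acct_daily (acct_id, bal)
VALUES (:acct_id, :bal);
.LOGOFF
.QUIT
BTEQ_EOF

exit $?
```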
TECHNICAL SKILLS
Programming Language: UNIX Shell scripting, COBOL, JCL, VSAM, SAS
Tools: Autosys, TSO/ISPF, INFOMAN, CHANGEMAN, File Aid, File Manager, SPUFI, SQL, MAXIMO, CA7, Endevor
ETL Tools: Informatica PowerCenter 9.x, 10.x, IDQ, Informatica BDM, Ab Initio
Databases: Oracle, DB2, Teradata
BigData Technologies: HDFS, MapReduce, Sqoop
Database Tools: Toad, SQL Developer
Version Control: Git, CVS, SVN
Query Tools: Teradata SQL Assistant, PuTTY, SQL Developer, WinSCP
Utilities: BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT, ARCMain
AGILE Tools: JIRA, RALLY
GUI: MS Office Suite, Visual Basic 6.0
Operating Systems: Windows, UNIX, MS-DOS, IBM S/3, MVS/ESA
PROFESSIONAL EXPERIENCE
Confidential
Informatica Developer
Responsibilities:
- Develop and support appropriate ETL routines and mappings using Informatica PowerCenter based on the requirements and technical design.
- Develop complex applications that combine data from various sources, tailored to specific business needs.
- Coordinate and develop process automation/job scheduling, perform testing and defect analysis.
- Work with Business Intelligence leadership team to develop and enhance ETL development process to meet the needs of the business and reduce risk.
- Read data from flat files, XML files and relational tables.
- Participate in design sessions with ETL developers, report developers, DBAs and Business Analysts.
- Interacted actively with Business Analysts and Data Modelers on mapping documents and the design process for various sources and targets.
- Applied the rules and profiled the source and target tables' data using IDQ.
- Imported and exported data into HDFS and Hive using Sqoop (see the Sqoop example after this list).
- Used Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for rule creation.
- Participate in and facilitate design, code, and unit test review sessions as needed.
- Responsible for choosing proper partitioning methods for performance enhancements.
- Involved in unit testing of mappings and mapplets, as well as integration testing and user acceptance testing.
- Scheduled sessions to extract, transform and load data into the warehouse database per business requirements.
- Created sessions and database connections; responsible for scheduling, monitoring and supporting production ETL jobs.
- Managing production support activities and responding to the user queries within SLA.
- Involved in writing shell scripts to load and process data.
- Used parameter files in mappings and sessions.
- Involved in creating and managing repositories.
- Involved in unit, integration and user acceptance testing of ETL Applications.
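
A hedged sketch of the Sqoop import/export described above; the JDBC URL, credentials, schema and table names are illustrative assumptions, not actual project details:

```sh
#!/bin/sh
# Illustrative Sqoop commands for moving data between a relational source and
# HDFS/Hive. The JDBC URL, schema and table names are placeholders.

# Import an Oracle table into a Hive table (created if it does not exist).
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user --password-file /user/etl/.pw \
  --table CLAIMS.POLICY \
  --hive-import --hive-table stage.policy \
  --num-mappers 4

# Export a processed HDFS directory back to a relational target table.
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user --password-file /user/etl/.pw \
  --table CLAIMS.POLICY_SUMMARY \
  --export-dir /user/etl/policy_summary \
  --input-fields-terminated-by '|'
```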
Environment: Informatica PowerCenter 10.1, Informatica Data Quality (IDQ) 9.6, Oracle 11g, Jira, UNIX, ESP scheduling tool, WIKI, Agile, Hadoop, HDFS, Sqoop
Confidential
Informatica Developer
Responsibilities:
- Develop appropriate ETL routines and mappings using Informatica PowerCenter 9.6.1 based on the requirements and technical design.
- Code optimized Teradata batch processing scripts for data transformation, aggregation and load using BTEQ.
- Write automation scripts and test ETL code using Ruby on Rails as the automation tool.
- Develop complex applications that combine data from various sources, tailored to specific business needs.
- Coordinate and develop process automation/job scheduling, perform testing and defect analysis.
- Work with Business Intelligence leadership team to develop and enhance ETL development process to meet the needs of the business and reduce risk.
- Implement data quality rules using Informatica IDQ and PowerCenter.
- Use Informatica PowerExchange (PWX) for CDC and for reading VSAM files.
- Participate in design sessions with ETL developers, report developers, DBAs and Business Analysts.
- Participate in and facilitate design, code and unit test review sessions as needed; responsible for choosing proper partitioning methods for performance enhancement.
- Involved in unit testing of mappings and mapplets, as well as integration testing and user acceptance testing.
- Scheduled sessions to extract, transform and load data into the warehouse database per business requirements; involved in reviewing and approving existing ETL jobs.
- Created sessions and database connections; responsible for scheduling, monitoring and supporting production ETL jobs.
- Involved in writing shell scripts to load and process data.
- Involved in writing shell scripts to schedule and automate ETL jobs (a minimal pmcmd sketch follows this list).
- Used parameter files in mappings and sessions.
- Involved in creating and managing repositories.
- Involved in unit, integration and user acceptance testing of ETL Applications.
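
A minimal sketch of a scheduling wrapper for production ETL jobs built around Informatica's pmcmd utility; the domain, service, folder and workflow names are hypothetical:

```sh
#!/bin/ksh
# Sketch of a shell wrapper a scheduler (cron, Autosys, ESP) might call to run
# and monitor an Informatica workflow. All names below are placeholders.

INFA_USER=etl_user
INFA_PWD=etl_pass          # in practice an encrypted password would be used
WF=wf_daily_load
LOG=/data/logs/etl_sched.log

# Start the workflow and block until it finishes (-wait).
pmcmd startworkflow \
  -sv IS_PROD -d Domain_PROD \
  -u "$INFA_USER" -p "$INFA_PWD" \
  -f FIN_DW -wait "$WF"
RC=$?

if [ $RC -ne 0 ]; then
  echo "$(date): $WF failed with return code $RC" >> "$LOG"
  exit $RC
fi
echo "$(date): $WF completed successfully" >> "$LOG"
```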
Confidential
Informatica Developer
Responsibilities:
- Develop appropriate ETL routines and mappings using Informatica BDM based on the requirements and technical design.
- Extracted data from different sources like Oracle, flat files, Netezza, XML files and other databases.
- Interacted actively with Business Analysts and Data Modelers on mapping documents and the design process for various sources and targets.
- Developed rules and mapplets that are commonly used across different mappings.
- Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.
- Used Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for rule creation.
- Involved in migrating mappings from IDQ to PowerCenter.
- Applied the rules and profiled the source and target tables' data using IDQ.
- Developed ETL routines using Informatica PowerCenter and created mappings involving transformations such as Lookup, Aggregator, Rank and Expression, as well as mapplets and connected and unconnected Stored Procedure transformations.
- Used SQL overrides in Lookups, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets.
- Extensively used Mapping Variables, Mapping Parameters to execute complex business logic.
- Designed and developed complex ETL mappings making use of Connected/Unconnected Lookup, Normalizer and Stored Procedure transformations.
- Proficient in using Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.
- Contributed to delivering logical/physical data models using dimensional and relational concepts such as Star Schema and Snowflake Schema modeling.
- Used Erwin R9.0 in the creation of Conceptual, Logical and Physical data models.
- Coordinated with the quality team; involved in data analysis and data profiling, and worked on data transformations and data quality rules.
- Used the Debugger on critical mappings by setting breakpoints, and troubleshot issues by checking session and workflow logs.
- Involved in identifying bottlenecks in sources, targets, mappings and sessions, and resolved them with performance tuning techniques such as increasing block size, data cache size and sequence buffer length.
- Developed UNIX shell scripts to create parameter files and rename files (see the sketch after this list).
- Worked closely with QA and UAT team in resolving the complex defect fixes.
- Responsible for documentation of the processes carried out, including design documents and mapping documents, using SharePoint for version control of documents and StarTeam for version control of source code for all interfaces in the Interface Project.
- Extensive experience working in an Agile development environment.
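
A hedged sketch of a shell script that generates a workflow parameter file before a run, as described above; the folder, workflow, directory and variable names are hypothetical, while the [Folder.WF:workflow] section header is the standard Informatica parameter file layout:

```sh
#!/bin/sh
# Sketch: build a workflow parameter file and rename the prior source file.
# Folder, workflow, directory and variable names are placeholders.

RUN_DATE=$(date +%Y-%m-%d)
PARAM_FILE=/data/params/wf_claims_load.parm

# Standard parameter file layout: a [Folder.WF:workflow] header followed by
# $$variable=value assignments that mappings and sessions can reference.
cat > "$PARAM_FILE" <<EOF
[FIN_DW.WF:wf_claims_load]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/landing/claims
\$\$TGT_SCHEMA=DW
EOF

# Rename yesterday's file so the new extract can land under the same name.
if [ -f /data/landing/claims/claims.dat ]; then
  mv /data/landing/claims/claims.dat \
     /data/landing/claims/claims_"$RUN_DATE".dat
fi
```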
Environment: Informatica BDM, Informatica Data Quality (IDQ) 9.6, Oracle 11g, UNIX, Teradata 14, ESP Scheduling tool, RALLY, Erwin