- More than 9 years of IT experience in Data Warehousing technology in Informatica.
- Worked extensively on design and analysis of requirements, development, testing and production support.
- Extensive experience with Informatica (10.2, 10.1, 9.6) applications; designed and developed Workflows, Worklets, Mappings and Sessions.
- Interacted with end-users and functional analysts to identify and develop the Business Requirement Document (BRD) and transform it into technical requirements.
- Worked on documentation of Design requirements, Test data, Data movement schema and Mapping between applications.
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
- Extensively worked on using the PDO (Push down optimization), CDC (Change data capture) mechanism.
- Expertise in using the Teradata utilities BTEQ, MultiLoad (MLOAD), FastLoad (FLOAD), TPT and FastExport in combination with Informatica for better loads into the Teradata warehouse.
- Hands on experience in HiveQL.
- Designed and developed complex mappings (Unconnected and Connected lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy) from varied transformation logics in Informatica.
- Experience in Debugging and Performance tuning of targets, sources, mappings and sessions in Informatica.
- Working knowledge on Cloud Platforms like AWS, GCP, OpenStack, OpenShift.
- Experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations, Mapplets and PL/SQL stored procedures.
- Experience provisioning, operating and maintaining systems running on AWS
- Extensively used Slowly Changing Dimension (SCD) technique in banking application.
- Conducted Unit tests, Integration tests and Customer Acceptance tests.
- Expertise in OLTP/OLAP System Study, Analysis, E-R modeling, developing Dimensional Models using Star schema and Snowflake schema techniques used in relational, dimensional and multidimensional modeling.
- Designed AutoSys based solutions for communication of issues to technical teams.
- Production support experience. Extensive understanding of Production issues and Data issues.
- Excellent communication skills, ability to communicate effectively with different levels of management, strong analytical, problem solving skills.
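As a sketch of the SCD Type 2 pattern referenced above, the expire-and-insert logic can be expressed as generated SQL. All table and column names (`dim_customer`, `stg_customer`, `current_flag`, etc.) are hypothetical, not taken from any specific project:

```shell
#!/bin/sh
# Generate (not run) SCD Type 2 expire/insert SQL for a hypothetical
# dim_customer table; names and columns are purely illustrative.
OUT=scd2_dim_customer.sql
cat > "$OUT" <<'SQL'
-- Step 1: expire the current row when a tracked attribute changed
UPDATE dim_customer d
   SET d.end_date = SYSDATE, d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address <> d.address);

-- Step 2: insert a new current version for new and changed customers
INSERT INTO dim_customer (customer_id, address, effective_date, end_date, current_flag)
SELECT s.customer_id, s.address, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
SQL
echo "wrote $OUT"
```

In PowerCenter the same two branches are typically built with a Lookup on the dimension feeding an Update Strategy (DD_UPDATE for the expiry path, DD_INSERT for the new version).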
ETL Tools: Informatica PowerCenter 10.2/9.6.5/8.5, Informatica PowerExchange 10.2/9.6.5/8.5, Informatica Data Quality 9.5, Informatica Designer, Workflow Manager, Workflow Monitor, IICS
Operating Systems: AS/400, AIX, UNIX, Windows 7/2008/XP
Databases: Oracle 12c/11g/10g/9i, SQL Server 2008, DB2, MS Access, Teradata 15/14
Data Modeling and Reporting Tools: Erwin 9.2, MS Visio 2010, TOAD, SQL Navigator, SQL*Loader, SQL Server Management Studio, SQL Assistant, QlikView 11.20, Cognos 8.4
Office Suite: MS Word, MS Power Point, MS Excel, MS Access.
Web Technologies: HTML, CSS, Java Script, UNIX Shell Scripting, Bash Scripting
Sr. Informatica ETL Developer
- Involved in designing and customizing logical and physical data models for a Data Warehouse supporting data from multiple sources in real time.
- Used Star Schema approach for designing of Data Warehouse using Erwin.
- Designed and developed Informatica mappings to extract, transform and load data into target tables.
- Extensively worked on Informatica Designer and Workflow Manager.
- Extensively worked on performance tuning of Informatica and IDQ mappings.
- Extensively used nearly all Informatica transformations, including Source Qualifier, Expression, Filter, Aggregator, Rank, Lookup, Stored Procedure, Sequence Generator, Joiner and Update Strategy.
- Worked with memory caches to improve throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
- Developed Slowly Changing Dimension (SCD) mappings.
- Developed Sessions, Worklets using Informatica Workflow Manager.
- Extensively worked in the Performance tuning of the Programs, Procedures and Processes.
- Translated business requirements into data warehouse design.
- Customized shell scripts to run mappings in Control-M.
- Worked on modules with relational (E/R) and dimensional data modeling in large transactional systems.
- Worked on technical Documentation of existing and proposed models.
- Performed analysis of large data sets using components from the Hadoop ecosystem.
- Worked closely with DBA and developers during planning, analyzing and testing phase of the project.
- Ran SQL scripts from TOAD and created Oracle objects such as tables, views, indexes, sequences and users; wrote SQL for problem resolution and performance optimization.
- Participated in identifying scope of the project, planning and development with other team members.
- Analyzed, enhanced and maintained Data Warehouse systems using RDBMS database and Informatica tools.
- Working knowledge of Informatica Cloud Real Time.
- Coordinated scheduling with various IT and management teams.
- Prepared test cases and performed unit testing; created test scenarios, documented actual results, performed system testing of overall processes, compared actual results to expected results, and wrote test plans and test cases in Quality Center.
- Worked on data cleansing and query tuning. Involved in analyzing the source data coming from different data sources like Oracle, Flat Files, XML and SQL Server and identifying data anomalies in operational data.
- Worked on the data mapping and design phases and deployed the mappings.
- Converted several Informatica workflows to Hadoop using Spark and Scala.
- Troubleshot development issues that arose during UAT.
- Designed and implemented a test environment on AWS.
- Developed Pre and Post SQL scripts, PL/SQL stored procedures and functions.
- Responsible for error handling using session logs and reject files in the Workflow Monitor.
- Used mapping Parameters and Variables to pass the values between sessions.
- Developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
Environment: IICS, Informatica PowerCenter 10.2, Informatica PowerExchange 10.1, Oracle 12c/11g, DB2, Teradata 15, MS SQL Server 2012, IDQ 9.5, Autosys, Snowflake cloud database, AWS S3, JSON, Hadoop, Erwin 9.2, Change Data Capture (CDC), Shell Scripting, ClearCase, PuTTY, WinSCP, Notepad++, JIRA, Control-M, Cognos 10.
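The Control-M shell wrappers described above can be sketched roughly as follows. The integration service, domain, folder, workflow and parameter-file names are placeholder assumptions, credentials (`-u`/`-p`) are omitted, and `PMCMD` defaults to a dry-run `echo` so the sketch runs without an Informatica installation:

```shell
#!/bin/sh
# Wrapper that a Control-M job would call to start one PowerCenter workflow.
# All names below are hypothetical; set PMCMD=pmcmd for a real run.
PMCMD=${PMCMD:-"echo pmcmd"}   # dry-run stub by default
LOG=wf_run.log

FOLDER=FIN_DW
WORKFLOW=wf_load_customer_dim
PARAMFILE=/app/infa/params/wf_load_customer_dim.par

# -wait blocks until the workflow finishes, so the exit code reflects success
$PMCMD startworkflow \
  -sv INT_SVC_DEV -d DOMAIN_DEV \
  -f "$FOLDER" -paramfile "$PARAMFILE" -wait "$WORKFLOW" >> "$LOG" 2>&1
rc=$?

if [ $rc -ne 0 ]; then
  echo "ERROR: $WORKFLOW failed with exit code $rc" | tee -a "$LOG" >&2
  exit $rc
fi
echo "OK: $WORKFLOW completed" | tee -a "$LOG"
```

Control-M then keys its success/failure branching off the script's exit code.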
Confidential, Mount Laurel, NJ
Sr. Informatica ETL Developer
- Member of core ETL team involved in gathering requirements, performing source system analysis and development of ETL jobs to migrate the data from the source to the target DW
- Analyzed the business requirement document and created functional requirement document mapping all the business requirements.
- Worked in Agile Methodology.
- Created logical and physical data models for the star schema using ERWIN r7
- Used Erwin to reverse-engineer and refine business data models.
- Involved in designing the process flow for extracting the data across various source systems.
- Prepared Data Architect document, Mapping Specification and Unit testing documents for ease of future maintenance and support
- Extracted data from various Relational Databases like SQL Server, Oracle, Flat Files using Informatica mappings.
- Subject Matter Expert for conversion and data mapping to Oracle
- Used HR Data knowledge, heavily participated in data conversion to Oracle Data Warehouse
- Installed and configured Informatica PowerExchange CDC for Oracle on the UNIX platform.
- Designed and developed mapping using various transformations like Source Qualifier, Expression, Lookup, Aggregator, Router, Rank, Filter and Sequence Generator transformations.
- Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
- Used Mapping Variables, Mapping Parameters in the Informatica Mappings to filter the daily data from the source systems.
- Made use of Post-Session success and Post-Session failure commands in the Session task to execute scripts needed for cleanup and update purposes.
- Worked with the Oracle database; developed stored procedures on Oracle and SQL Server for data manipulation and data warehouse population.
- Involved in writing PL/SQL code in Oracle stored procedures, functions and packages to support applications front end and back end.
- Developed PL/SQL procedures, functions to facilitate specific requirement.
- Worked on SQL Loader for bulk load of data and used SQL tuner for tuning SQL.
- Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.
- Used the feature EXPLAIN PLAN to find out the bottlenecks in each Query, thus improving the performance of the job.
- Automated the batch jobs using UNIX shell scripts.
- Tuned performance on Sources, targets and mappings and SQL queries in the mappings.
- Developed sessions using different partition types, such as round-robin and hash-key partitioning, for better performance.
- Automated UNIX shell scripts to verify the count of records added every day by the incremental data load for a few of the base tables, checking data consistency.
- Involved in writing windows batch scripting.
- Scheduled jobs using Autosys and monitored automated weekly jobs.
- Prepared detailed design documentation for the production support and release management departments to use as a reference guide for future production runs before code migration.
- Conducted and led team meetings and provided status reports to the project manager.
Environment: Informatica PowerCenter 10.1, Informatica PowerExchange 10.1, Teradata 14, Oracle 11g, Flat files, Linux, Teradata SQL Assistant, MS SQL Server, Data Ladder, IBM ClearCase, UNIX, PL/SQL, Autosys, XML files, Production support, Erwin Data Modeler.
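The daily record-count verification scripts mentioned above can be sketched as below. The extract and control-file names, their layout, and the sample data are all invented for the illustration; in practice the expected count would come from the source system:

```shell
#!/bin/sh
# Verify that an incremental extract holds the expected number of records.
# The sample extract and control file here are fabricated for the demo.
printf 'row1\nrow2\nrow3\n' > daily_extract.dat
echo 3 > daily_extract.ctl          # source-side count, one number

actual=$(wc -l < daily_extract.dat)
expected=$(cat daily_extract.ctl)

if [ "$actual" -eq "$expected" ]; then
  echo "COUNT OK: $actual records"
else
  echo "COUNT MISMATCH: expected $expected, loaded $actual" >&2
  exit 1
fi
```

A nonzero exit here lets the scheduler flag the load for investigation before downstream jobs run.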
Confidential, Jersey City, NJ
Informatica ETL Developer
- Worked with the business team to gather requirements for projects and created strategies to handle the requirements.
- Worked on project documentation, which included the Functional, Technical and ETL Specification documents.
- Used Informatica for data profiling and data cleansing, applying rules and developing mappings to move data from source to target systems.
- Designed and implemented ETL mappings and processes as per the company standards, using Informatica PowerCenter.
- Extensively worked on complex mappings, which involved slowly changing dimensions.
- Developed several complex mappings in Informatica using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets and Parameter files in Mapping Designer, using both Informatica PowerCenter and IDQ.
- Worked extensively on Informatica transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update strategy, Sequence generator and Joiners.
- Debugged mappings by creating logic that assigns a severity level to each error and sends error rows to an error table so they can be corrected and re-loaded into the target system.
- Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
- Implemented performance and query tuning on all the objects of Informatica using SQL Developer.
- Created the design and technical specifications for the ETL process of the project.
- Responsible for mapping and transforming existing feeds into the new data structures and standards, utilizing Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy and Stored Procedure transformations.
- Worked with SQL*Loader to load data from flat files obtained from various facilities.
- Worked on loading of data from several flat files to Staging using Teradata MLOAD, FLOAD and BTEQ.
- Worked with the Release Management Team for the approvals of the Change requests, Incidents using BMC Remedy Incident tool.
- Worked with the infrastructure team to make sure that the deployment is up to date.
Environment: Informatica PowerCenter 9.6, Confidential DB2, Control-M, PL/SQL, FileZilla, Windows, UNIX Shell Scripting, SQL Server 2000/2008, UNIX, COBOL, Erwin, Shell script, Rapid SQL, TOAD, Teradata 13, Visio, Oracle 10g, Autosys, ClearCase.
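A rough shape of the shell-driven BTEQ staging loads noted above: the script generates (but here does not execute) a BTEQ import. The server, credentials, database/table names and column layout are placeholders, and the password is masked:

```shell
#!/bin/sh
# Generate a BTEQ script that loads a staging table from a pipe-delimited
# flat file. Every identifier below is a placeholder for illustration.
BTEQ_SCRIPT=load_stg_accounts.bteq
cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,********;
.IMPORT VARTEXT '|' FILE = /data/in/accounts.dat;
.QUIET ON
.REPEAT *
USING (acct_id VARCHAR(18), acct_name VARCHAR(60))
INSERT INTO stg_db.stg_accounts (acct_id, acct_name)
VALUES (:acct_id, :acct_name);
.QUIT;
EOF
echo "generated $BTEQ_SCRIPT"
# a real run would follow with: bteq < "$BTEQ_SCRIPT"
```

For high-volume files the same wrapper pattern applies with an MLOAD or FLOAD control script in place of the BTEQ `.IMPORT`.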
Confidential, Jacksonville, FL
Informatica ETL Developer
- Developed ETL jobs to extract information from Enterprise Data Warehouse.
- Extensive use of ETL process to load data from different RDBMS, XML and flat files.
- Used Informatica Repository Manager to create repositories and users and set permissions.
- Experience in Performance tuning of SQL queries and mainframe applications.
- Involved in Data Extraction from Oracle, Flat files, Mainframe files using Informatica.
- Debugged the mappings extensively, hard-coding test data to test the logic instance by instance.
- Handled the migration process across Development, Test and Production environments.
- Implemented Type 2 slowly changing dimensions to maintain dimension history and tuned the mappings for Optimum Performance.
- Developed PL/SQL Procedures at the database level that were used in the mappings through Stored Procedure Transformation
- Used Informatica Designer to design mappings and coded it using reusable Mapplets.
- Developed workflow sequences to control the execution sequence of various jobs and to email support personnel.
- Involved in unit testing and documenting the jobs and workflows.
- Set Standards for naming conventions and best practices for Informatica mapping development.
- Worked with production support systems that required immediate support.
- Used database objects such as sequences and stored procedures to handle complex logic.
- Created various UNIX shell scripts for job automation of data loads.
- Created mappings that implement error-handling logic, setting error/ok flags and an error message depending on the source data and the lookup outputs.
Environment: Informatica PowerCenter 9.5, Oracle 10g/9i, Oracle Data Integrator, SQL, PL/SQL, SQL Developer, Erwin, MS VISIO, Autosys, QC, LINUX/ UNIX.
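The error-flag pattern described in this section can be illustrated with a small log-scanning sketch. The log format, workflow name and support address are invented for the example, and mail delivery is stubbed with `echo` so nothing is actually sent:

```shell
#!/bin/sh
# Scan a session log for errors and raise a flag file for support.
# The log content below is a fabricated sample; MAIL_CMD is a stub.
MAIL_CMD=${MAIL_CMD:-"echo would-mail"}
LOG=s_m_load_orders.log

cat > "$LOG" <<'EOF'
INFO  : session start
ERROR : TT_11023 Row rejected by Update Strategy
INFO  : session end
EOF

errors=$(grep -c '^ERROR' "$LOG")
if [ "$errors" -gt 0 ]; then
  echo "FLAG=ERROR ($errors errors)" > session.flag
  $MAIL_CMD "support@example.com" "session failed: see $LOG"
else
  echo "FLAG=OK" > session.flag
fi
```

Downstream jobs can then check `session.flag` before proceeding, which is the shell-side analogue of the ok/error flag columns built inside the mappings.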
Confidential, Atlanta, GA
Informatica ETL Developer
- Gathered requirements and implemented them in source-to-target mappings.
- Integrated data sources such as SQL Server and MS Access, and non-relational sources such as flat files, into the staging area.
- Designed custom reports via SQL Server Reporting Services to align with requests from internal account teams and external clients.
- Designed and developed Technical and Business Data Quality rules in IDQ (Informatica Developer) and created the Score Card to present it to the Business users for a trending analysis (Informatica Analyst)
- Effectively worked on Mapping Designer, Workflow Manager, and Workflow Monitor.
- Extensively used Sequence Generator in all mappings; fixed bugs and tickets raised in production against existing mappings in the common folder, delivering urgent fixes for new files through versioning (check-in and check-out), and supported QA in component unit testing and validation.
- Used shortcuts for sources, targets, transformations, mapplets, and sessions to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
- Applied slowly changing Dimensions Type I and Type II on business requirements.
- Extensively worked on performance tuning and on isolating headers and footers in a single file.
- Worked independently with large amounts of data, executing data analysis with appropriate tools and techniques, and interpreting and presenting results to both internal and external clients.
- Wrote SQL queries to create end-user reports; developed SQL queries and stored procedures in support of ongoing work and application support.
- Used Cognos Transformer to build multidimensional cubes.
- Designed and executed test scripts and test scenarios, reconciling data between multiple data sources and systems.
- Involved in requirement gathering, Design, testing, project coordination and migration.
- Project planning and scoping, facilitating meetings for project phases, deliverables, escalations and approval. Ensure adherence to SDLC and project plan.
- Worked on Multidimensional Models and created Reports in Report Studio using Cubes as a data source.
- Performed profiling analysis on existing data, identified root causes for data inaccuracies, and provided impact analysis and data quality recommendations.
- Precisely documented mappings in the ETL technical specification document for all stages for future reference.
- Scheduled jobs for running daily, weekly and monthly loads through control-M for each workflow in a sequence with command and event tasks.
- Created requirement specification documents, user interface guides, functional specification documents, ETL technical specification documents and test cases.
- Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, Lookup (connected and unconnected), along with Mapping Parameters, Session Parameters, Mapping Variables and Session Variables.
- Responsible for studying the existing data warehouse and working on migrating existing PL/SQL packages, stored procedures, triggers and functions to Informatica PowerCenter.
- Fine-tuned ETL processes by considering mapping and session performance issues.
- Responsible for Creating workflows and Worklets. Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
- Maintained the proper communication between other teams and client.
Environment: Informatica PowerCenter 9.5, PowerExchange 9.5, Oracle 7.0.2/9i, Erwin r7, SQL, PL/SQL, DB2 8.0, MS SQL Server 2008, Flat Files, Autosys, Windows XP, UNIX, SQL*Loader, TOAD, ANSI SQL
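The source-to-target reconciliation checks mentioned above can be sketched as a key-level diff of two extracts. The file names, key values and one-key-per-line layout are assumptions made for the example:

```shell
#!/bin/sh
# Reconcile business keys between a source extract and a target extract.
# Tiny sample files are created inline; comm requires sorted input.
printf 'A100\nA101\nA102\n' | sort > source_keys.txt
printf 'A100\nA102\n' | sort > target_keys.txt

# -23 suppresses lines unique to target and lines common to both,
# leaving only keys present in the source but missing from the target
comm -23 source_keys.txt target_keys.txt > missing_in_target.txt

missing=$(wc -l < missing_in_target.txt)
echo "RECON: $missing key(s) missing in target"
```

Here the check reports one missing key (`A101`); in a real run the two key lists would be spooled from the source and target databases before the `comm` step.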
Confidential, Birmingham, AL
- Designed and developed complex mappings using Mapping Designer to load data from various sources.
- Used Pipeline partitioning and External Loader to improve Session performance, created indicator files for event driven sessions.
- Used the update strategy to effectively load data from source to target.
- Involved in writing unit test cases and test plans.
- Created tasks, workflows and worklets using workflow manager.
- Used pmcmd to run workflows and Autosys to automate their schedules.
- Created UNIX shell scripting for automation.
- Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing bugs so that they conform to the business needs.
- Handled Incremental loading using mapping variables and Implemented versioning of folders in the Informatica Designer tool.
- Used Parameter files to specify DB Connection parameters for sources.
- Performed performance tuning in SQL Server 2008 using SQL Profiler, and handled data loading.
Environment: Informatica PowerCenter 8.1, SQL, UNIX, Shell Scripting, SQL Server 2008, Sybase, Oracle, Control-M, Cognos 8.4
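Parameter files like the ones described above follow PowerCenter's `[Folder.WF:workflow.ST:session]` section convention; the folder, workflow, session and connection names below are placeholders. A small generator sketch:

```shell
#!/bin/sh
# Emit a PowerCenter parameter file so each run picks up the right DB
# connections and run date. All names here are illustrative placeholders.
RUN_DATE=$(date +%Y%m%d)
PARFILE=wf_daily_load.par

cat > "$PARFILE" <<EOF
[FIN_DW.WF:wf_daily_load.ST:s_m_stage_orders]
\$DBConnection_SRC=ORA_SRC_DEV
\$DBConnection_TGT=ORA_DW_DEV
\$\$RUN_DATE=$RUN_DATE
EOF
echo "wrote $PARFILE"
```

The wrapper then passes the file to the workflow with pmcmd's `-paramfile` option, which is how the DB connection parameters for sources were switched between environments.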