Informatica/Oracle Developer Resume
SUMMARY
- 14+ years of total IT experience and technical proficiency in building Data Warehouses, Data Marts, Data Integration, Operational Data Stores, and ETL processes for clients in the Financial and Manufacturing domains.
- 7+ Years of experience in using Informatica PowerCenter and 3+ Years of experience in Talend DI tool.
- Experienced in the development, testing, and support of Data Integration and Data Warehouse projects. Extensively worked with Informatica PowerCenter, Oracle, UNIX, and the Talend Data Integration tool, with exposure to Tableau 10.2 and the Hadoop ecosystem.
- Experience interacting with various stakeholders such as Data Architects, Database Administrators, Business Analysts, Business Users, and Application Developers.
- Extensive experience in Extraction, Transformation, and Loading (ETL) data from various data sources into Data Warehouse and Data Marts using Informatica PowerCenter and Talend DI tool.
- Created complex mappings in Talend using components such as tMap, tJoin, tReplicate, tParallelize, tAggregateRow, tDie, tUnique, tFlowToIterate, tSort, tFilterRow, tDBInput, tFTPConnection, tFTPGet, tSendMail, tFlowMeterCatcher, tJavaRow, tFileList, tFileCopy, and tFTPClose.
- Advanced knowledge of and hands-on experience in decision support systems (DSS), querying and analysis, Talend, and Data Warehousing.
- Extensive experience in developing strategies for Extraction, Transformation, Loading (ETL) data from various sources into Data Warehouse and Data Marts.
- Created derivations and business rules to be used by ETL for mapping source data for the population of Data Warehouse and Data Marts.
- Well versed with Talend Big Data, Hadoop, Hive and used Talend Big data components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.
- Expertise in extracting data from various sources into the Hadoop Distributed File System (HDFS).
- Experience in migrating data into Snowflake on AWS using Talend.
- Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions and Workflows using Informatica PowerCenter.
- Extensively used the Aggregator, Sort, Merge, Join, Change Capture, and Peek stages in Parallel Extender jobs.
- Strong Knowledge of Data Warehouse architecture in Designing Star Schema, Snowflake Schema, FACT and Dimensional Tables.
- Experience in optimizing Informatica mappings and sessions for performance.
- Working knowledge of Informatica Intelligent Cloud Service.
- Efficient in writing stored procedures in PL/SQL to manage the data flow.
- Worked on creating reports and workbooks in Tableau 10.X. Knowledge of Hadoop ecosystem components such as HDFS, Job Tracker, Task Tracker, NameNode, and DataNode, and of Hadoop distributions from Cloudera.
- Experience in preparing ETL design documents such as High-Level and Low-Level Design documents.
- Excellent experience with relational databases (RDBMS) including Oracle Exadata, Oracle 11g/10g, and Microsoft SQL Server.
- Involved in Logical and Physical Design, Backup, Restore, Data Integration and Data Transformation Service.
- Excellent communication skills; an active, self-motivated team player, fast learner, and good listener; very flexible in maintaining the work schedule, with the ability to work overtime if required.
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 10.X/9.X, Informatica Intelligent Cloud Service, Talend Studio 7.X, Talend Big Data
Hadoop Ecosystem: HDFS, Hive, Pig, Sqoop
Language & Scripting: PL/SQL, AWK & Sed, Python, Core Java
Reporting Tools: Tableau 10.X
Databases: Oracle 12c Exadata, SQL Server 2017, Snowflake
Scheduling Tool: DAC, Informatica Scheduler, IBM Tivoli
Operating Systems: UNIX, Windows
Other Tools: Jira, Toad
PROFESSIONAL EXPERIENCE
Confidential
Informatica/Oracle Developer
Responsibilities:
- Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it from the source system (Oracle), incorporating the business rules through the different objects and functions that the tool supports.
- Designed and coded the Escheatment rule engine using stored procedures, and designed the method to ingest ODS data into staging tables.
- Wrote various stored procedures in Oracle to implement the business logic for dormancy period calculation (a simplified sketch follows this role).
- Tuned the Informatica mappings for optimal load performance.
- Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
- Used features like pre-session/post-session commands/SQL, fine-tuned databases, mappings, and sessions to get optimal performance.
- Prepared post process data validation scripts.
- Modified existing mappings for enhancements of new business requirements.
- Involved in Dimensional modeling to Design and develop STAR Schemas using ERwin to identify Fact and Dimension Tables.
- Wrote stored procedures to send data for reporting purposes to the front-end Java application.
- Worked on developing Informatica mappings for processing large volumes of data in the form of flat files and database tables using different rules.
- Optimized existing mappings and sessions for performance.
- Worked closely with the DBA to enhance the performance of the report generation process.
- Used UNIX shell scripts to FTP the source files from EDI servers using the config files.
- Using Informatica PowerCenter created mappings and mapplets to transform the data according to the business rules.
- Involved in Reviews & Unit Testing of Mappings and Workflows.
Environment: Informatica 10.2, Oracle Exadata, SQL Developer, UNIX.
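A minimal sketch of what a dormancy-calculation stored procedure of this kind could look like; the ACCOUNTS table, its columns, and the default threshold are illustrative assumptions rather than the actual escheatment schema.

```sql
-- Illustrative sketch only: ACCOUNTS, its columns, and the default threshold
-- are assumptions, not the real escheatment schema.
CREATE OR REPLACE PROCEDURE calc_dormancy (
    p_dormancy_days IN NUMBER DEFAULT 1095   -- e.g., a three-year dormancy period
) AS
BEGIN
    -- Flag active accounts whose last activity is older than the dormancy period
    UPDATE accounts a
       SET a.dormancy_status = 'DORMANT',
           a.dormancy_date   = SYSDATE
     WHERE a.dormancy_status = 'ACTIVE'
       AND a.last_activity_date < SYSDATE - p_dormancy_days;

    COMMIT;
END calc_dormancy;
/
```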
Confidential
ETL/Talend Developer
Responsibilities:
- Built a dimensional data model to house data from various sources (a sample star-schema sketch follows this role).
- Designed and developed Talend ETL jobs for loading data into the new Data Mart from the existing Everest system.
- Worked on SDI (standard data ingestion) to load data into Greenplum, using the following Talend components to develop jobs: tPrejob, tJava, tSetGlobalVar, tFileInputExcel, tFileOutputDelimited, tFileInputDelimited, tContextLoad, tJavaRow, tLoop, tRunJob, tOracleConnection, tDie, tWarn, tOracleInput, tFileList, tFileExist, tFileSystem, tFileCopy, tFileDelete, tFileReplicate, tFlowToIterate, tHashInput, tParallelize, tFileRowCount, tSendMail, tMap, tFileTouch, tSleep, tFixedFlowInput, tBufferOutput, etc.
- Talked to end users to resolve any ingestion issues related to data or gaps.
- Conducted business requirement gathering sessions with end users.
- Worked extensively with DBAs and the platform team to resolve software issues and different types of connection issues.
- Extensive knowledge and understanding of dimensional modelling (Star schema, Snowflake schema) along with knowledge of relational databases.
- Leveraged the existing exception handling and logging framework.
- Helped the marketing and sales teams get the required BI reports for their marketing strategies.
Environment: Talend Data Management Platform, SQL Server, UNIX, IBM Tivoli
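A minimal sketch of the kind of star-schema model referred to above, in generic ANSI-style DDL; the fact and dimension names and columns are illustrative assumptions, not the actual data mart.

```sql
-- Illustrative star-schema sketch: table and column names are assumptions.
CREATE TABLE dim_customer (
    customer_key   INTEGER PRIMARY KEY,       -- surrogate key
    customer_id    VARCHAR(30),               -- natural/business key
    customer_name  VARCHAR(100),
    region         VARCHAR(50)
);

CREATE TABLE dim_date (
    date_key       INTEGER PRIMARY KEY,       -- e.g., YYYYMMDD
    calendar_date  DATE,
    fiscal_quarter VARCHAR(10)
);

CREATE TABLE fact_sales (
    customer_key   INTEGER REFERENCES dim_customer (customer_key),
    date_key       INTEGER REFERENCES dim_date (date_key),
    quantity       INTEGER,
    sales_amount   NUMERIC(12,2)
);
```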
Confidential
ETL/BI Developer
Responsibilities:
- Worked on different projects using Informatica, Talend, Tableau, OBIEE, Oracle, OBIA 7.9.6.3, and UNIX; primarily responsible for designing and developing ETL jobs using different ETL tools and Oracle PL/SQL, and additionally worked on report development using Tableau.
- Involved in all phases of SDLC, requirement, design, development, testing, migration, and support for production environment.
- Extensively worked with various lookup caches such as Static cache, Dynamic cache, and Persistent cache. Tuned Informatica mappings and sessions for long-running jobs. Worked on scheduling workflows with DAC.
- Created and modified different PL/SQL objects to efficiently manage the data flow.
- Worked on ETL tasks for ingesting a diverse set of files into a central repository using Talend Open Studio.
- Created SCD Type I and Type II mappings for loading dimension tables in Informatica as well as in Talend (a simplified Type II sketch follows this role).
- Designed, developed, and implemented reporting solutions in Tableau while maintaining expected productivity and quality.
Environment: Informatica PowerCenter, Talend Open Studio, Tableau, Oracle, UNIX, OBIEE, DAC
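A minimal sketch of the expire-and-insert logic behind an SCD Type II load, expressed as plain Oracle-style SQL; DIM_CUSTOMER, STG_CUSTOMER, and the sequence are illustrative assumptions, not the mappings actually built.

```sql
-- Illustrative SCD Type II sketch: table, column, and sequence names are assumptions.
-- Step 1: close out current rows whose attributes have changed in staging.
UPDATE dim_customer d
   SET d.current_flag = 'N',
       d.effective_to = SYSDATE
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.customer_name <> d.customer_name
                       OR s.region     <> d.region));

-- Step 2: insert a new current version for new or changed customers.
INSERT INTO dim_customer
      (customer_key, customer_id, customer_name, region,
       effective_from, effective_to, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.region,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id  = s.customer_id
                      AND d.current_flag = 'Y');
```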
Confidential
ETL Informatica Developer
Responsibilities:
- Designed end-to-end ETL framework for the Gateway.
- Prepared the HLD (High-Level Design) and LLD (Low-Level Design) documents.
- Worked with Informatica PowerCenter, Netezza, Oracle 11g, SQL Server, flat files, and other sources to integrate data, and worked in the Data Warehouse development life cycle.
- Key player in the team, assigned to design extracts/mappings handling millions of records and complex logic.
- Acted as technical consultant for the team, providing optimal solutions, design reviews, code reviews, and support for any Informatica-related issues.
- Worked with different mappings, such as getting data from Oracle tables, SQL Server, and flat files, reading dimension and fact tables in two separate data marts, and loading Oracle target tables. Also used pre-loaded stored procedures.
- Used Informatica features to implement Type I, II changes in slowly changing dimension tables and developed complex mappings to facilitate daily, weekly and monthly loading of data.
- Depending on the transformation logic, used various transformations such as Sorter, Aggregator, Lookup, Expression, Filter, Update Strategy, Joiner, and Router in mappings.
- Configured reusable Command tasks for sending triggers and invoking different scripts, and reusable Email tasks for sending success mails and failure mails with session details and appended logs. Also used Event Wait, Event Raise, and Scheduler options in the workflows.
- Designed and developed many Oracle stored procedures, triggers, views, indexes, and queries for enhancements and maintenance of various database modules.
- Designed processes for the framework, including processes for Data Cleansing and Quality checks.
- Created low-level designs for generic, completely metadata-driven ETL processes using Informatica (a sample control-table sketch follows this role).
- Wrote UNIX shell scripts and pmcmd commands for FTP of files from the remote server and backup of the repository and folders, and also maintained the housekeeping scripts.
Environment: Informatica PowerCenter 8.6.1, SQL Developer, Shell Scripts, Oracle 10g
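One common way such metadata-driven designs are implemented is with a control table that the generic mappings and parameter files read at run time; the ETL_CONTROL table and the workflow name below are hypothetical illustrations, not the actual framework.

```sql
-- Illustrative control-table sketch for metadata-driven ETL: all names are assumptions.
CREATE TABLE etl_control (
    job_name        VARCHAR2(100),
    source_table    VARCHAR2(100),
    target_table    VARCHAR2(100),
    load_type       VARCHAR2(20),   -- e.g., FULL or INCREMENTAL
    watermark_col   VARCHAR2(60),   -- column used for change data capture
    last_run_date   DATE,
    active_flag     CHAR(1)
);

-- A generic workflow would read its runtime parameters from this table, e.g.:
SELECT source_table, target_table, load_type, watermark_col, last_run_date
  FROM etl_control
 WHERE job_name    = 'WF_LOAD_SALES'   -- hypothetical workflow name
   AND active_flag = 'Y';
```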
Confidential
ETL Lead Developer
Responsibilities:
- Played a key role in writing the shell scripts to automate the EACM application.
- Used sed and AWK extensively in the shell scripts.
- Prepared Unit test cases and performed Unit testing for validating the data to meet expected results.
- Re-engineered existing mappings to support new/changing business requirements.
- Used Mapping, Sessions Variables/Parameters, and Parameter Files to support change data capture and automate workflow execution process to provide 24x7 available data processing.
- Tuned SQL statements, mappings, sources, targets, transformations, sessions, database, and network to remove bottlenecks, and used Informatica parallelism options to speed up data loading to the target.
- Developed Informatica mappings to populate data into dimension and fact tables for data classifications used by end developers.
- Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.
- Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation for encoding and decoding functions and Lookup transformation to identify slowly changing dimensions (SCD).
- Created various tasks like Pre/Post Session, Command, Timer and Event wait.
- Prepared SQL queries to validate the data in both source and target databases (a sample reconciliation query follows this role).
- Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
- Prepared the error handling document to maintain the error handling process.
- Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database.
- Monitored batches and sessions for weekly and Monthly extracts from various data sources to the target database.
- Analyzed production issues with Data Stage loads and worked with DBAs, UNIX administrators, and FTP teams to resolve issues in a timely manner.
- Involved in the creation of DDL and DML scripts for stage and target tables as per the requirements.
Environment: Data Stage, DB2, Shell Scripts, AIX, XML, Core Java
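A minimal sketch of the kind of source-to-target reconciliation query used for such validation; SRC_ORDERS and TGT_ORDERS are assumed table names (EXCEPT is the DB2 form; Oracle uses MINUS).

```sql
-- Illustrative reconciliation sketch: SRC_ORDERS / TGT_ORDERS are assumed names.
-- Row-count comparison between source and target
SELECT (SELECT COUNT(*) FROM src_orders) AS src_count,
       (SELECT COUNT(*) FROM tgt_orders) AS tgt_count
  FROM sysibm.sysdummy1;               -- DB2 dummy table (use DUAL on Oracle)

-- Rows present in the source but missing or different in the target
SELECT order_id, order_amount FROM src_orders
EXCEPT                                  -- MINUS on Oracle
SELECT order_id, order_amount FROM tgt_orders;
```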