- Around 6 years of experience in system analysis, design, development, production support, and maintenance of Data Warehousing/Business Intelligence (BI) projects using ETL tools such as Informatica PowerCenter.
- Strong Data Warehousing experience using Informatica PowerCenter v8.x and v9.x, IBI Data Migrator, and Ab Initio.
- Good knowledge of Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.
- Strong understanding of Hadoop architecture and HDFS.
- Worked on integrating data from various sources such as Teradata, Oracle, and MySQL.
- Expertise in designing Informatica mappings involving sources and targets such as Oracle, Teradata, flat files, CSV files, SQL Server, and XML files.
- Worked extensively with Slowly Changing Dimensions (SCD).
- Hands on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Used Oracle Enterprise Manager to monitor and troubleshoot long-running SQL queries.
- Hands-on experience in writing, testing, and implementing triggers, procedures, and functions at the database level using PL/SQL.
- Performed admin activities such as installation, creating folders and user accounts, and starting and stopping repository and domain services in the Informatica Admin Console.
- Experienced in UNIX scripting, FTP and networking principles.
- Worked on scheduling tools such as Informatica Scheduler and Tivoli Workload Scheduler.
- Created reports using MS SQL Server Reporting Services (SSRS) to report the status of various Informatica jobs.
- Working knowledge of reporting/visualization tools such as Tableau, SSRS, and Cognos.
- Developed replication jobs using Informatica Cloud Services.
- Strong experience in dimensional modeling using Star and Snowflake schemas, including identifying facts and dimensions.
- Involved in unit testing and system testing to verify that data loaded into targets is accurate.
- Proficient in performance analysis, monitoring and SQL query tuning in Oracle and Informatica.
- Working experience in Agile projects.
- Worked in an onshore/offshore project model, leading and mentoring a team of data engineers.
Databases: Oracle 9i/10g/11g, MySQL, Netezza, Teradata 12, DB2
ETL Tools: Informatica PowerCenter v8.x and v9.x, Informatica PowerExchange, Informatica Cloud Services, Ab Initio, IBI Data Migrator
Scheduling tools: Informatica Scheduler, Autosys, Tivoli Workload Scheduler, ESP
Other tools: TOAD, Oracle Enterprise Manager, PuTTY, WinSCP
Reporting tools: MS SQL Server Reporting Services 2005, Tableau
Operating Systems: UNIX, DOS, and Windows
Programming Skills: C, C++, Shell Scripting (Korn Shell, C Shell), Perl, PL/SQL, Java, HTML, JavaScript, J2EE, CSS
Confidential, Washington, DC
Senior Programming Analyst
Environment: IBI Data Migrator 8.x, Linux, SQL Server, Oracle, JIRA, Mainframe
- Extracted data from various sources such as SQL Server and Oracle and loaded it into the target warehouse.
- Developed and tested data and process flows in Data Migrator Console based on the business requirements.
- Created Linux shell scripts for tasks that could not be accomplished with the Data Migrator Console (DMC).
- Involved in performance tuning of the existing data flows.
- Owned and led the development of claims and policies for the Umbrella line of business and loaded them successfully into the IPF warehouse.
- Worked on extracting JSON and XML data and loaded them into Oracle tables.
- Presented demonstrations on Linux shell scripting, performance tuning, and the IPF architecture.
Confidential, Los Angeles, CA
Senior Informatica Consultant
Environment: Informatica PowerCenter 9.5.1 (PowerCenter Designer, Workflow Manager, Monitor, Repository Manager), DB2, Quality Center, Mainframe, Toad, PuTTY, and UNIX
- Experienced in working with various data sources such as Excel, flat files, and database tables.
- Responsible for design, development and maintenance of ETL processes to populate warehouse.
- Led the database analysis, design, and build effort.
- Created complex mappings in PowerCenter Designer using Expression, Filter, Sequence Generator, Update Strategy, Joiner, and Stored Procedure transformations, along with advanced features such as pushdown optimization.
- Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings using Informatica Designer.
- Performed daily, weekly, and monthly processing of the Farmers EDBI applications.
- Involved in maintenance activities for the Farmers EDBI applications (FDR, MLCDM, AOI, HOI, HGDM, SABER).
- Demonstrated expertise with ETL tools, including Informatica and ETL package design, and RDBMSs such as MySQL and Oracle.
- Working experience with Mainframe technologies such as COBOL and JCL.
- Created scripts for archiving and backing up PowerCenter folders.
- Analyzed the source data coming from Oracle ERP system to create the Source to Target Data Mapping in Informatica.
- Performed SQL development, including query writing, optimization, reporting, and schema creation.
- Designed maps of source system data to data warehouse models.
- Developed strategies for data acquisitions and integration of the new data into Domo's Data engine.
- Accessed Box Folders and pulled data directly from them at run time.
- Monitored performance and tuned databases to optimize for different workloads.
- Conducted Post deployment analysis on production server for the new changes.
- Presented defect report to the higher management.
- Developed procedures and functions in PL/SQL and fine-tuned them to eliminate full table scans, reducing disk I/O and sorts.
- Used Procedure components to invoke Oracle PL/SQL procedures.
- Used UNIX to create parameter files for real-time applications and developed shell scripts.
- Performed data masking while preserving the referential integrity of the user data.
- Performed data encryption on user and client data to maintain consistency and security.
- Involved in unit testing and system testing of individual components.
- Analyzed existing system and developed business documentation on changes required.
- Provided 24x7 production support for ETL runs and fixed defects in a timely manner.
Confidential, Denver, CO
Senior Informatica Developer
Environment: Informatica PowerCenter 8.1 and 9.1.0 (PowerCenter Designer, Workflow Manager, Monitor, Repository Manager), Informatica Cloud Services, Oracle, Netezza, Teradata, Sybase, Remedy, Toad, MS SQL Server 2008, Oracle Enterprise Manager, Autosys, Oracle Exadata, PuTTY, Python, and UNIX
- Developed complex mappings using various transformations such as Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Union, Update Strategy, Rank, Sequence Generator, Normalizer, HTTP, and Web Services, according to business logic.
- Created and configured source and target connections in Informatica, and prepared Source-to-Target Mapping documents and Low-Level Design documents.
- Developed UNIX based scripts which helped in effective monitoring of Data Warehouse jobs.
- Tested the changes and code fixes implemented as part of change requests.
- Monitored deployment activity when specific deliverables went into production.
- Created new extracts for external vendors and used Informatica ETL workflows to move data out of multiple sources such as Siebel, EON, and Salesforce.
- Ensured that deployed code worked correctly in the production environment.
- Migrated code from Informatica 8.1 to Informatica 9.x.
- Supported monthly releases and was involved in data validation.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
- Responsible for 24x7 production support of Informatica jobs.
- Identified data issues and implemented permanent fixes to prevent session failures from recurring.
- Worked with Teradata utilities such as FastLoad, MultiLoad, and TPump.
- Collaborated with DBA teams to fine-tune heavy queries and optimize Informatica PowerCenter workflows.
- Developed tabular reports using SSRS.
- Created Tableau dashboards as per requirement.
- Used pipeline partitioning techniques such as round-robin, key range, and hash partitioning, along with pushdown optimization, for performance tuning at the session level.