To pursue a career that applies my potential and knowledge to completing projects on time and within budget, while ensuring professional and personal growth.
- Over 7 years of experience in the IT industry, involved extensively in Software Development Life Cycle (SDLC) projects comprising requirements capture, analysis, estimation, design, coding, testing, and production support.
- Strong domain knowledge of commercial utility, healthcare, banking, manufacturing, and research services.
- Extensive experience in data warehousing using Informatica PowerCenter 9.1 and earlier versions.
- Experience with databases such as Oracle, DB2 UDB, MS Access, SQL Server 2005/2008, Netezza, and Sybase, using SQL, PL/SQL, and SQL*Plus. Strong experience with database interfaces such as PL/SQL Developer, DbVisualizer, SQL*Plus, and TOAD.
- 6 years of experience with IBM InfoSphere Information Server 8.5/8.1/8.0.1 and Ascential DataStage 7.x/6.x/5.2, both Enterprise and Standard editions, using components such as DataStage Designer, DataStage Manager, DataStage Director, and DataStage Administrator.
- Installation, configuration, setup, and administration of DataStage Server.
- Extensively used SQL and PL/SQL to write stored procedures, functions, packages, and triggers.
- Extensive experience in database partitioning, query tuning, performance tuning of reports (Explain Plan), and troubleshooting Oracle application issues on OLTP and DSS systems.
- Working knowledge of report generation using Business Objects, Cognos, and DataStage.
- Extensive knowledge of data analysis, data quality, data conversion, data integration, data mining, and data migration, with specialization in Informatica PowerCenter.
- Well versed in the data warehousing methodologies of Ralph Kimball and Bill Inmon.
- Experience with Star and Snowflake schema design.
- Extensive experience using Informatica to design and develop complex mappings with a wide range of transformations.
- Experience writing UNIX shell scripts to run sessions automatically, abort sessions, and create parameter files; wrote a number of shell scripts to run batch jobs.
- Worked with business analysts and business users to understand requirements; excellent analytical and problem-solving skills.
- Prepared job sequences for existing jobs to facilitate scheduling of multiple jobs.
- Worked with data sources such as flat files, COBOL files, CSV files, and XML files.
- Worked extensively with scheduling tools such as Tivoli and AutoSys.
- Good exposure to the software quality management system (QMS) across all project phases.
- Expertise in defining, creating, documenting, verifying, and executing unit test cases, test data, test scenarios, and test plans.
- Worked on onsite/offshore model projects.
- Excellent analytical skills; self-motivated and hardworking.
- Willing to learn and adapt to new challenges.
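The shell-script automation summarized above often comes down to generating a fresh parameter file per run. The sketch below illustrates one way this could look; the folder, workflow, and variable names are hypothetical placeholders, not taken from any actual project.

```shell
#!/bin/sh
# Minimal sketch: generate an Informatica parameter file for a workflow run.
# Folder, workflow, and variable names are hypothetical placeholders.

PARAM_DIR=${PARAM_DIR:-/tmp/param_demo}
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="$PARAM_DIR/wf_load_members_$RUN_DATE.par"

mkdir -p "$PARAM_DIR"

# Unquoted heredoc: $RUN_DATE expands, while \$\$ yields the literal $$
# prefix Informatica expects for mapping/workflow variables.
cat > "$PARAM_FILE" <<EOF
[FOLDER_DEMO.WF:wf_load_members]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_FILE=/data/inbound/members_$RUN_DATE.csv
\$\$TGT_SCHEMA=STAGE
EOF

echo "Wrote parameter file: $PARAM_FILE"
```

A scheduler such as Control-M or AutoSys would typically run a script like this immediately before starting the workflow, so each run picks up the current date's file paths.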
Education: Master of Computer Applications
Confidential, CA, USA. Jun ’09 – Jul ’12
Project : Edison Smart Connect, MRTU, ERP
Role : ETL Developer
Project Details: –
Southern California Edison provides power to most Southern California residents. IT&BI projects are actively implemented to enhance and improve SCE’s IT infrastructure. The role of IT&BI goes beyond the traditional information technology “service provider”: many of the innovative ideas and projects that shape the company’s future and move SCE forward depend on technology, and IT&BI employees are at the heart of these projects, collaborating, designing, and executing technology solutions that are transforming the industry.
Responsibilities:
- As a Senior ETL Consultant, created mappings, coordinated with the offshore team, performed code reviews, set up the ETL environment, and was accountable for planning, design, development, and implementation support as required.
- Performed project requirements gathering, requirements analysis, design, development, and testing for the ETL, data warehousing, and reporting modules of the project.
- Developed complex Informatica mappings to load data from various sources using transformations such as Source Qualifier, connected and unconnected Lookup, Expression, Aggregator, Joiner, Filter, Normalizer, Rank, and Router.
- Performance tuned Informatica mappings, ETL processes, SQL and database.
- Extensively used Informatica to load data from flat files, CSV files, and COBOL files.
- SQL tuning using hints.
- Load balancing of ETL processes, database performance tuning and capacity monitoring.
- Analyzed existing system and developed business documentation on changes required.
- Used UNIX shell scripts to create parameter files and to support real-time applications.
- Monitored workflows in Workflow Monitor, provided batch support, and fixed failures.
- Involved in migrating code from Dev to QA and from QA to Prod.
- Prepared detailed design documentation for the production support department to use as a hand guide for future production runs before the code was migrated.
- Worked on SAP objects such as Asset Management, Order to Cash, material master, vendors, recipes, purchase contract information, inspection plans, customer specs, and purchase information records.
- Prepared a unit test plan and created efficient unit test documentation along with unit test cases for the developed code.
- Developed a complex mixed-structure load file for LSMW standard programs using UNIX scripts.
- Created detailed system defect records to keep the project team informed of status throughout the process.
- Used Update Strategy to insert and update data for implementing the Slowly Changing Dimensions (Type 1 SCD and Type 2 SCD) logic in mappings.
- Developed jobs in Data Stage Designer for global region specific Data Transformations and requirements.
- Developed reusable transformations and reusable mapplets.
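The batch-support work described above (monitoring workflows and fixing failures) is commonly automated with a shell wrapper around Informatica's pmcmd command. The sketch below shows the general shape; the service, domain, folder, and workflow names are hypothetical, and PMCMD defaults to an echo stub so the script can run without an Informatica installation.

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper: start a workflow, wait for it, report failures.
# INFA_* values and the workflow name are made-up placeholders; PMCMD is
# stubbed with echo here so the example is runnable standalone.

PMCMD=${PMCMD:-"echo pmcmd"}
INFA_SERVICE="IS_DEMO"
INFA_DOMAIN="DOM_DEMO"
INFA_USER="etl_user"
WF_FOLDER="FOLDER_DEMO"
WF_NAME="wf_load_members"

# -wait blocks until the workflow finishes, so the exit code reflects
# workflow success or failure rather than just submission.
$PMCMD startworkflow \
    -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
    -u "$INFA_USER" -f "$WF_FOLDER" -wait "$WF_NAME"
rc=$?

if [ "$rc" -ne 0 ]; then
    echo "Workflow $WF_NAME FAILED with exit code $rc" >&2
    exit "$rc"
fi
echo "Workflow $WF_NAME completed successfully"
```

In a real environment the scheduler would invoke this wrapper and alert on a nonzero exit code, which is what "providing batch support and fixing failures" typically hangs off.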
Environment: Informatica 9.1/8.6, Oracle 11g/10g, flat files, UNIX, Windows XP, TOAD, SQL*Plus, ERwin, Business Objects, DataStage, Tivoli, UNIX shell scripts, Control-M, PL/SQL, SQL Developer, T-SQL, SQL Server 2008, FileZilla, VSAM, JCL, Mainframe.
Confidential, MN, USA. Jan ’08 – May ’09
Project : Membership Data Warehouse (MDW)
Role : ETL Developer
Project Details: –
The Membership Data Warehouse (MDW) program is a single trusted source of cross-regional membership data that supports all membership-related data marts and reporting functions by providing an accurate, up-to-date (within 24 hours) view of core membership information from across all regions. In the future, this will help reduce the time required to provide accurate membership data to the actuarial and pricing functions, and thus the time required to generate premium rates.
Responsibilities:
- Involved in designing, developing, maintaining, and enhancing ETL processes.
- Worked with Informatica tools: Source Analyzer, Warehouse Designer, and Mapping Designer.
- Involved in creating sessions and workflows to load the data into the Target Database.
- Scheduled and monitored tasks using Workflow Monitor.
- Promoted code to UAT and production environments after user sign-off.
- Unit testing of the developed Code.
- Involved in UAT support, Interacting with the users and resolving issues that arise during User Acceptance Testing.
- Set up the environment for DataStage Designer, Monitor, and Administrator.
- Prepared source-to-target documents and other documentation to help in understanding the data warehouse and business rules.
- Fixed existing bugs in the ETL code, the SQL code used in extract jobs, and other mapping errors.
- Ensuring quality standards are maintained for all deliverables.
- Status reporting and escalation of issues to the Project Manager.
- Used Parameters and variables to design Informatica ETL processes.
- Added new jobs (Utility, new sources) to the existing process.
- Used DataStage Version Control for change management and elevation of jobs to the higher environments from development.
- Profiled the sources to understand the source data and the rules that had to be applied to make the ETL work.
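Source profiling of the kind mentioned above often starts with simple row, distinct-key, and duplicate counts on a delimited extract before any mapping is designed. A minimal sketch follows; the sample file and its column layout (member_id, region, status) are invented for illustration.

```shell
#!/bin/sh
# Sketch: basic profiling of a delimited source extract prior to ETL design.
# The sample data and column layout are hypothetical stand-ins for a real
# membership source file.

SRC=${SRC:-/tmp/profile_demo.csv}
cat > "$SRC" <<EOF
member_id,region,status
1001,NORTH,ACTIVE
1002,SOUTH,ACTIVE
1003,NORTH,TERMED
1002,SOUTH,ACTIVE
EOF

# Data rows, excluding the header line.
ROWS=$(($(wc -l < "$SRC") - 1))
# Distinct values in the first (key) column.
DISTINCT_IDS=$(($(tail -n +2 "$SRC" | cut -d, -f1 | sort -u | wc -l)))
DUPES=$((ROWS - DISTINCT_IDS))

echo "rows=$ROWS distinct_member_ids=$DISTINCT_IDS duplicate_keys=$DUPES"
```

Counts like these tell the designer early whether the natural key is reliable, which in turn drives decisions such as deduplication rules or SCD handling in the mappings.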
Environment: Informatica 8.6/7.1, DataStage, Oracle 10g, SQL Server 2005/2008, Windows XP, TOAD, SQL*Plus, ERwin, shell scripts, Control-M, Cognos, PL/SQL, T-SQL
Project : CSDW
Role : ETL Developer
Project Details: –
Cox Communications is a privately owned subsidiary of Cox Enterprises, headquartered in Atlanta, GA, and provides digital cable television and telecommunications services in the United States. I worked with the Billing Data Mart and the mappings related to it. I was also actively involved in performance tuning of previously created mappings and in scheduling daily loads and weekly roll-ups. The sources included Oracle, flat files, and DB2. The data mart was built on Oracle. I worked with a team on the entire ETL process and on development of the data mart using Informatica PowerCenter.
- Participated in the review and approval of the technical transformation requirements document.
- Used technical transformation document to design and build the extraction, transformation, and loading (ETL) modules.
- Performed source data assessment and identified the quality and consistency of the source data.
- Extensively worked with Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Developed transformation logic and designed various complex mappings and mapplets using the Designer.
- Developed complex mappings to implement Slowly Changing Dimensions (SCD).
- Configured and ran the Debugger from within the Mapping Designer to troubleshoot mappings before the normal run of the workflow.
- Used Workflow Manager for workflow and session management, database connection management, and scheduling of jobs to be run in the batch process.
- Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected), while transforming the sales force data according to the business logic.
- Used and developed shell scripts for pre- and post-session commands for the developed mappings and for scheduling.
- Worked with the pmcmd command-line program to communicate with the Informatica server to start, stop, and schedule workflows.
- Performed SQL tuning using Explain Plan.
- Extensively used SQL and PL/SQL scripts and worked in both UNIX and Windows environments.
- Fine-tuned ETL processes by considering mapping and session performance issues.
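A post-session shell command of the kind described above frequently just archives the flat files a session has consumed, so a rerun never picks the same file up twice. The sketch below illustrates the idea; the directory layout and file name are hypothetical.

```shell
#!/bin/sh
# Sketch of a post-session command: move a processed source file into an
# archive directory with a timestamp suffix. Paths are hypothetical.

SRC_DIR=${SRC_DIR:-/tmp/postsess_demo/inbound}
ARCHIVE_DIR=${ARCHIVE_DIR:-/tmp/postsess_demo/archive}
STAMP=$(date +%Y%m%d%H%M%S)

mkdir -p "$SRC_DIR" "$ARCHIVE_DIR"
touch "$SRC_DIR/sales_extract.csv"      # stand-in for the processed file

for f in "$SRC_DIR"/*.csv; do
    [ -f "$f" ] || continue             # skip if the glob matched nothing
    mv "$f" "$ARCHIVE_DIR/$(basename "$f").$STAMP"
    echo "Archived $(basename "$f") to $ARCHIVE_DIR"
done
```

Wiring a script like this into the session's post-session success command keeps the inbound directory clean and leaves a timestamped audit trail of every file that was loaded.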
Environment: Informatica 8.1/7.1, MS Project, MS Office, CICS, DB2 8.2.1, SQL, JCL, VSAM, IMS, TSO/ISPF, SYNCSORT, IDCAMS, MVS, ChangeMan, Platinum, Xpediter, DumpMaster, SPUFI, SOA, QMF, File-AID, FTP, Abend-AID, IBM 3270.
Confidential, India. Jan ’06 – Dec ’06
Role: Informatica Developer
VS Technologies is an IT services company that has provided consulting services to many sectors of industry for the past ten years. This project involved designing a system to monitor the inventory of the client organization.
Responsibilities:
- Analyzed requirements and prepared design documents.
- Coded online and batch programs.
- Provided 24/7 support for the application.
- Prepared unit/integration test plans and test data, and executed unit, integration, and regression test cases.
- Prepared status reports.
- Participated in weekly status meetings with the client and manager.
Environment: Informatica 7.1, Oracle 9i, SQL Server 2000, Windows XP.