- Over six years of IT experience designing, developing, testing, analyzing, and maintaining various software applications
- Involved in all phases of the software development life cycle
- Extensive working experience in sectors such as Banking, Logistics, and Communications
- Expertise in Data Warehousing, Data Migration, and Data Integration using Business Intelligence (BI) tools such as Informatica PowerCenter and OBIEE
- Extensive experience in using the different components of Informatica PowerCenter
- Experience with various relational databases, including Oracle 11g/10g, MS SQL Server 2005/2008, DB2, and Teradata
- Good understanding of Star and Snowflake schemas, Dimensional Modeling, E-R modeling, Slowly Changing Dimensions (SCD), and data warehousing concepts
- Working experience in tuning mappings to enhance performance
- Expertise in SQL/PLSQL programming, developing and executing Packages, Stored Procedures, Triggers, Table Partitioning, and Materialized Views
- Experience in UNIX shell scripting for automating batch and ETL jobs
- Experience in Scheduling the Scripts using Autosys and UC4 Schedulers
- Involved in all aspects of ETL: requirements gathering, defining standard interfaces for operational sources, data cleansing, devising data load strategies, designing and developing mappings, unit testing, integration testing, regression testing, and UAT
- Able to work independently and in teams, with strong interpersonal and communication skills
ETL Tools: Informatica PowerCenter 9.5/9.1/8.6/8.1/7.1, PowerExchange 9.5/9.1/8.6/8.1
OLAP/BI: Cognos, Business Objects 5.0/6.5, OBIEE 10.1
Data Modeling: Erwin 4.0, Star-Schema Modeling, FACT and Dimension Tables
RDBMS: Oracle 11g, 10g, 9i, MS SQL Server 2005/2008, Teradata, DB2
Programming: SQL, PL/SQL, T-SQL, HTML, DHTML, XML, UNIX Shell Scripting, Visual Basic
OS: Windows 2003/2008, HP-UX, Linux
Others: MS Project, VISIO, SQL Developer, TOAD, MS Office
Confidential, Jersey City, NJ
Sr. ETL Developer
The project involved retiring the legacy ODS database and building a new centralized data repository called the Enterprise Data Hub (EDH). This meant integrating existing source systems (transaction applications, reference data applications, and flat-file sources) into EDH, as well as integrating new source systems as they went live within the bank.
The project also involved ensuring that all downstream feeds from ODS were seamlessly transitioned to EDH with minimal downstream impact. The existing OBIEE reporting layer also had to be rebuilt on EDH, and all existing reports migrated to the new schema structure.
- Involved in understanding existing reporting requirements and downstream data feed requirements
- Worked with business/data analysts to finalize data requirements and hand them off to the Data Architecture team
- Worked closely with the data modeling team to finalize the staging area and the base dimension and fact tables
- Performed source system analysis and target database structure analysis
- Created data mapping specifications from source to staging and from staging to EDH
- Designed ETL mappings to extract, cleanse, transform, and load data into the target database
- Designed workflows with multiple sessions using decision, assignment, event-wait, and event-raise tasks, depending on data load requirements
- Created shell scripts to kickoff Informatica workflows.
- Performed performance tuning at the functional and mapping levels.
- Used relational SQL wherever possible to minimize the data transfer over the network.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
- Performed unit testing at various levels of the ETL and actively involved in team code reviews
- Implemented Slowly Changing Dimensions.
- Used Autosys as Job Scheduling tool to schedule Informatica jobs.
- Involved in Unit Testing, Integration Testing and Performance Testing of ETL.
- Built data marts to satisfy business requirements.
- Provided production support to resolve ongoing issues and troubleshoot problems.
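The Slowly Changing Dimension implementation noted above can be sketched in SQL. This is a minimal Type 2 load under assumed, hypothetical table and column names (customer_dim, stg_customer, customer_dim_seq) — not the project's actual schema:

```sql
-- Hedged sketch of an SCD Type 2 load (all names hypothetical).
-- Step 1: expire the current dimension row when staging attributes changed.
UPDATE customer_dim d
   SET d.effective_end_dt = SYSDATE,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.name <> d.name OR s.address <> d.address));

-- Step 2: insert a new "current" version for new and changed business keys.
INSERT INTO customer_dim (customer_key, customer_id, name, address,
                          effective_start_dt, effective_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.name, s.address,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y'
                      AND s.name = d.name
                      AND s.address = d.address);
```

In practice this two-statement pattern often runs inside one Informatica session or a post-SQL step; NULL-safe comparisons and surrogate-key lookups would be added per the real table design.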
Environment: Informatica PowerCenter 9.1, Oracle 11g, DB2, Linux, Autosys & Unix/Perl.
Confidential, Herndon, VA
The Loan Accounting Initiative (LAI) was implemented at Confidential to handle the end-to-end loan accounting system. It was implemented within the existing Sub Ledger System (SLS), which breaks accounting events into individual debits and credits, maintains sub-ledger balances, and integrates with the General Ledger.
- Worked on Data Acquisition, Data Induction, Data Transformation and Data Validation for Confidential Loan Accounting Initiative Legacy Systems.
- Worked on the ETL (Informatica/UNIX) portion of the Loan Accounting Initiative project, which performs technical validation and business data validation for incoming (Inbound Processor) and outgoing (Outbound Processor) accounting data into and out of LAI systems.
- Analyzed and understood business and customer requirements by interacting with Business Analysts, Data Modelers and Subject Matter Experts (SME).
- Wrote and reviewed documents like Functional Specifications (FS), Solution Specifications (LAI - SSD), ETL Specifications.
- Worked in an agile environment with frequently changing requirements; participated in daily Scrum meetings and daily status calls.
- Involved in creation of Informatica mappings to build complex business rules to load data using transformations like Source Qualifier, Expression, Aggregator, Connected and Unconnected Lookups, Filter, Router, Rank, Mapplet Input, Mapplet output, Update Strategy, Normalizer, Java, Stored procedure, and Sequence generator transformations.
- Extensively used mapping parameters, mapping variables to provide the flexibility and parameterized the workflows for different system loads.
- Extensively worked on Mapplets, Reusable Transformations, and Worklets, thereby providing flexibility to developers in subsequent increments.
- Extensively worked on UNIX shell scripts to handle pre- and post-session tasks and used UNIX checksums during technical and business data validation.
- Created sessions and workflows according to the data loads into different systems.
- Created the Autosys job creation documents and sent them to the Command Center (CC) team to schedule Informatica jobs and UNIX scripts accordingly.
- Extensively worked on Performance tuning of targets, sources, mappings and sessions.
- Worked on creation of Oracle Tables, Views, Materialized Views and Synonyms.
- Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
- Extensively worked on Informatica Code Migration processes across the DEV/QAE and PROD environments by maintaining Version Control (Check In/Check Out) using TortoiseSVN tool.
- Implemented the Error Records Handling mechanism Using Event Processing tables.
- Involved in different phases of testing like Unit, Functional, Integration and System testing.
- Implemented SCD Type 1 and SCD Type 2 methodologies when loading ODS tables, to preserve historical data in the data warehouse.
- Created review documents for the specification document and test cases.
- Troubleshot load failures, including database problems.
- Prepared documents such as Deployment Doc, Estimation Reports, Run Book Doc, Traceability Reports, development tracking reports, and weekly status reports (WSR).
- Involved in scheduling the jobs in Autosys using UNIX scripts.
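The Oracle materialized view work mentioned above can be illustrated. Below is a minimal fast-refresh aggregate view under assumed, hypothetical names (loan_txn, mv_loan_daily_bal) — a sketch, not the project's actual objects:

```sql
-- Hedged sketch (hypothetical names). A fast-refreshable aggregate MV needs a
-- materialized view log with ROWID and the referenced columns, plus COUNT(*).
CREATE MATERIALIZED VIEW LOG ON loan_txn
  WITH ROWID (loan_id, amount)
  INCLUDING NEW VALUES;

CREATE MATERIALIZED VIEW mv_loan_balance
  BUILD IMMEDIATE
  REFRESH FAST ON COMMIT
AS
SELECT loan_id,
       SUM(amount) AS balance,
       COUNT(*)    AS txn_cnt   -- required for fast refresh of SUM
  FROM loan_txn
 GROUP BY loan_id;
```

Reporting queries then read the precomputed balances from the view instead of re-aggregating the transaction table on every request.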
Environment: Informatica PowerCenter 9.5.1, Oracle 11g/10g, Autosys Scheduler, Windows XP (client), Informatica servers on UNIX (Solaris), WinSCP, PuTTY, TortoiseSVN, TOAD.
Confidential, Colorado Springs, CO
Confidential Corporation is one of the leading players in logistics industry and a leader in transporting shipments (including freight) all across the globe through various modes of transportation. The operational performance metrics data mart supports operational dashboards, which are used by the executives to make operational decisions on a daily basis.
- Informatica PowerExchange was used to read the Oracle database logs and capture changes in the data. The changes were then propagated to a staging area through a "Pull" mechanism using the Informatica PowerCenter & PowerExchange connector.
- Involved in designing and developing logical & physical data models using Erwin to support the operational reporting applications.
- Performed peer code reviews to ensure compliance with ETL standards.
- Enhanced system performance by optimizing and tuning database objects, reports and ETL processes.
- Assisted the team members in designing & developing ETL mappings and workflows.
- Tuned Existing Oracle scripts for better performance.
- Coded SQL scripts to create the development, testing, and production databases, including tablespaces (adding datafiles, managing control files, and creating rollback segments), users, synonyms, roles, profiles, and privileges.
- Developed Informatica mappings, executed sessions, and validated the results.
- Developed SQL and PL/SQL code for various procedures, functions, and packages to implement the business logic in an Oracle database
- Managed Metadata associated with Informatica. Queried the metadata for reporting purposes.
- Worked on the complete Life cycle of Business Intelligence project with focus on Extraction, Transformation and Loading of data using Informatica Power Center.
- Developed several complex Oracle Scripts.
- Used workflow tasks such as Session, Control, Command, Decision, Event-wait, and Email tasks, along with pre-/post-session and pre-/post commands.
- Created, updated and maintained technical documentation, Unit Test plans and release notes.
- Worked as a backup admin; handled production support issues and resolved issues during project deployment.
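The PL/SQL procedures and packages described above typically follow a pattern like this minimal sketch (all table, sequence, and package names are hypothetical, chosen only to illustrate the structure):

```sql
-- Hedged sketch of a packaged ETL load procedure (hypothetical names).
CREATE OR REPLACE PACKAGE etl_load_pkg AS
  PROCEDURE load_shipment_fact(p_batch_id IN NUMBER);
END etl_load_pkg;
/
CREATE OR REPLACE PACKAGE BODY etl_load_pkg AS
  PROCEDURE load_shipment_fact(p_batch_id IN NUMBER) IS
  BEGIN
    -- Apply the business rule: move one batch from staging into the fact table.
    INSERT INTO shipment_fact (shipment_key, ship_dt, weight_kg, batch_id)
    SELECT shipment_seq.NEXTVAL, s.ship_dt, s.weight_kg, p_batch_id
      FROM stg_shipment s
     WHERE s.batch_id = p_batch_id;
    COMMIT;
  EXCEPTION
    WHEN OTHERS THEN
      ROLLBACK;       -- leave the target consistent on failure
      RAISE;          -- re-raise so the calling session/workflow fails visibly
  END load_shipment_fact;
END etl_load_pkg;
/
```

Packaging related procedures this way keeps grants and invalidation manageable, and lets an Informatica Stored Procedure transformation or post-SQL step call `etl_load_pkg.load_shipment_fact` with the current batch id.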
Environment: Informatica PowerCenter 8.6, Informatica PowerExchange 8.5, Oracle 11g, DB2, UNIX, Windows 2000/XP, FTP Voyager, Quality Center & TOAD
Confidential, Kansas City, MO
Confidential offers a comprehensive range of wireless and wireline communication services, bringing the freedom of mobility to consumer, business, and government users.
- Extensively used Informatica 8.1 to load data from sources involving Oracle, SQL Server and Flat files to Oracle Database.
- Involved in the Extraction, Transformation and loading of the data from various sources into the dimensions and the fact tables in the Data Warehouse.
- Created reusable Transformations and Mapplets and used them in various mappings.
- Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
- Created Informatica mappings with PL/SQL procedures to build business rules to load data.
- Used most of the common transformations, such as Source Qualifier, Aggregator, Connected & Unconnected Lookup, Filter, and Sequence Generator.
- Assisted in Generating reports using the report-generating tool, Business Objects 6.0
Environment: Informatica 8.1, Business Objects XI, Oracle 11g, SQL Server 2005, Flat files, SQL, PL/SQL, Windows NT
Confidential provides IT services to major financial institutions in India. The project involved building a BI application for a major banking client using Oracle BI apps.
- Participated in discussions with clients to better understand business requirements
- Created forms for new policy entry details.
- Responsible for maintaining policies as per customer requirements
- Wrote PL/SQL Stored Procedures for data extraction.
- Created reports that allow users to retrieve complete information on financial status as required, such as monthly reports and day-to-day transactions.