- 6 years’ experience in the design, development, and support of data warehouse applications using Informatica Power Center 8.x/9.x
- Extensive hands-on experience using Informatica Power Center 9.x/8.x to create and edit transformations, mappings, tasks, sessions, and workflows that extract data from various types of sources, transform it according to predefined logic, and load it into target databases in the required formats.
- Performance tuning at all levels, from mapping to database, using techniques such as partitioning and pushdown optimization.
- Strong in source-to-target data mapping and CDC (Change Data Capture) using Slowly Changing Dimension mappings, incremental aggregation, etc.
- Experience in tuning sources, targets, mappings, transformations, sessions, and SQL scripts.
- Knowledge of data partitioning and session partitions in Informatica.
- Good experience and knowledge in creating stored procedures and triggers in Oracle.
- Highly skilled in integrating various data sources such as Oracle 10g, DB2, SQL Server, flat files, Excel sheets, XML, and IBM mainframe COBOL files.
- Skilled in unit testing, system integration testing, and UAT.
- Experienced with Unix/Linux operations and programming for file manipulation, including Bash and Korn shell scripting, using PuTTY and WinSCP.
- Practical understanding of data modeling concepts such as star-schema and snowflake modeling, and fact and dimension table modeling at all three levels: view, logical, and physical.
- Experience with Erwin features such as forward engineering and reverse engineering.
- Extensively worked with OLAP tools such as Business Objects and Cognos
- Good understanding of the SDLC (Systems Development Life Cycle) and project management workflow, and of writing data-movement High Level Design and Low Level Design documents.
- Good knowledge of logical and physical data modeling for data stores, data marts, and data warehouses (star schema, snowflake schema, fact & dimension tables), CDC, and SCD Type 1, Type 2, and Type 3 updates.
- Experienced in SQL development and database management for Oracle 10g/11g, DB2, Teradata, PostgreSQL, and SQL Server 2000/2005, including DDL, DML, indexing strategy, data auditing, triggers, and stored procedures, using tools such as SQL Developer, TOAD, Spring Source Tool, and Oracle Enterprise Manager.
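The SCD Type 2 updating listed above follows the usual close-old-row / insert-new-row pattern; a minimal sketch is below. This is a generic illustration only — the table and column names (DIM_CUSTOMER, STG_CUSTOMER, CURRENT_FLAG, etc.) are hypothetical, not from any actual engagement.

```shell
#!/bin/sh
# Generic SCD Type 2 sketch: emit the close-old-row / insert-new-row SQL
# for one changed business key. All table and column names are hypothetical.
scd2_sql() {
  key="$1"
  cat <<EOF
UPDATE DIM_CUSTOMER
   SET EFF_END_DT = SYSDATE, CURRENT_FLAG = 'N'
 WHERE CUST_ID = ${key} AND CURRENT_FLAG = 'Y';
INSERT INTO DIM_CUSTOMER (CUST_ID, CUST_NAME, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
SELECT CUST_ID, CUST_NAME, SYSDATE, NULL, 'Y'
  FROM STG_CUSTOMER
 WHERE CUST_ID = ${key};
EOF
}

# Print the statement pair for one key; in Informatica the same logic is
# typically split across an Update Strategy transformation and two target instances.
scd2_sql 101
```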
ETL Tools: Informatica 8.x/9.1 (Power Mart & Power Center), Power Connect
Data Modeling Tools: Erwin
Databases: Oracle 11g/10g, Teradata, DB2, MS SQL Server
RDBMS Load Tools: SQL*Loader (Oracle)
Business Intelligence Tools: IBM Cognos BI, Cognos Insight, Tableau, Business Objects, Data Modeling, Star Schema, Snow Flake Schema, OLTP, OLAP.
Data Analyst Tools: IBM SPSS
Languages: SQL, C, C++, Java, Python, Shell Scripting, XML, HTML
Operating System: Windows NT/2000/XP/7, Red Hat Linux, UNIX (Solaris, AIX).
Office Applications: MS Office 2003/2007/2013, OpenOffice suite
Other Tools: TOAD and SQL*Plus (database tools)
Confidential, Bloomington, IL
- Analyzed the system architecture, studied the project workflow, and set up all tools and access.
- Worked with business analysts and the DBA to understand data-movement requirements, including business rules, technical rules, volume, and frequency.
- Used Spring Source Tool to query the source database (DB2) and target database (PostgreSQL).
- Used Microsoft Word and Visio to write the data Pop/Sync (population and synchronization) Low Level Design and High Level Design Documents.
- Adopted EDBLoader for bulk loads in high-volume data population, and the full-comparison logic of Informatica mappings for synchronization.
- Used PostGIS to deal with spatial data for PostgreSQL database.
- Developed Informatica mappings for Salesforce data integration with logic, such as lookup, load validation and reprocess.
- Worked with BMC Middleware Management and the XML Parser transformation of Informatica to mock up and extract data from queues and messages.
- Implemented the business rules by using Expression, Lookup, Joiner, Aggregator, Filter and Update strategy Transformations.
- Loaded and queried data in Salesforce.com and captured the successful and error files of data loading.
- Recorded the error records generated by load failures and loaded them into the exception table (DB2) to prepare for reprocessing.
- Parameterized all inputs, outputs, directories, and connection strings using parameter files and tables, from the project level down to the mapping level.
- Composed shell scripts to run workflows via the pmcmd utility and scheduled the jobs using Control-M.
- Manipulated files in Linux using PuTTY and WinSCP.
- Conducted knowledge-transfer sessions with the offshore team and design and code review meetings with clients.
Environment: Informatica Power Center 9.1.0, DB2, PostgreSQL, Salesforce.com, XML, Korn Shell Script, Control-M, Lotus Notes, Windows 7 Professional, Linux, PuTTY, WinSCP, Spring Source Tool, BMC Middleware Management.
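The pmcmd-based workflow launch scripted above might look roughly like the wrapper below. It is a sketch only: the domain, integration service, folder, user, and workflow names are placeholders, not values from the actual project.

```shell
#!/bin/sh
# Hypothetical sketch of a pmcmd wrapper of the kind Control-M would invoke.
# Domain, integration service, folder, user, and workflow names are placeholders.
build_pmcmd_cmd() {
  wf_name="$1"
  param_file="$2"
  printf "pmcmd startworkflow -sv IS_DEV -d Domain_DEV -u infa_user -p '****' -f APP_FOLDER -paramfile %s -wait %s" "$param_file" "$wf_name"
}

# Control-M would execute the built command; here we only print it.
build_pmcmd_cmd wf_LOAD_CUSTOMER /opt/infa/params/wf_LOAD_CUSTOMER.par
echo
```

Building the command in a function keeps the credentials and connection details in one place per environment, which is also why the bullets above parameterize connection strings rather than hard-coding them.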
- Designed database solutions to satisfy application (business and technical) requirements
- Implemented database solutions using available database development tools like BPM, Power Center
- Provided first level operational support for development and production database systems
- Provided written status reports to management regarding project status, tasks, and issues/risks
- Developed complex Informatica mappings for large volumes of data using different types of transformations, such as Union, connected and unconnected Lookup, Router, Filter, Aggregator, Expression, and Update Strategy transformations
- Performed unit and regression testing of the Informatica mappings created according to the business requirements
- Used SQL tools like TOAD to run SQL queries to validate the data
- Created workflows and sessions using Informatica Workflow Manager and monitored workflow runs and statistics in Informatica Workflow Monitor
- Defined mapping parameters and variables and session parameters according to the requirements and performance-related issues
- Tuned Informatica code for better performance
- Used the version control provided by Informatica on every object, so that each version of the process remains available for recovery or research purposes
- Developed UNIX scripts for updating the control table
Environment: Informatica Power Center 9.1.0, Informatica Power Exchange 9.1.0, Business Process Modeling (BPM), SAP R/3 7.1, Oracle 11g, Mainframe, DB2, Flat files, TOAD, Visio, PL/SQL, SQL, UNIX (AIX), Unix Shell Scripting, Windows 7
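The control-table update scripted above (recording each load's outcome) might look roughly like this; the table name ETL_CONTROL and its columns are assumptions for illustration, not the real schema.

```shell
#!/bin/sh
# Hypothetical control-table update run after each load completes.
# The table name ETL_CONTROL and its columns are illustrative only.
control_update_sql() {
  job_name="$1"
  status="$2"
  printf "UPDATE ETL_CONTROL SET LAST_RUN_TS = SYSDATE, STATUS = '%s' WHERE JOB_NAME = '%s';\n" "$status" "$job_name"
}

# In production the output would be piped into the database client, e.g.:
#   control_update_sql wf_LOAD_ORDERS SUCCESS | sqlplus -s "$DB_CONN"
control_update_sql wf_LOAD_ORDERS SUCCESS
```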
Confidential, NY, NY
- Worked with the DBA and data analysts to study the legacy data and the new third-party data feeds, and composed reports on the ETL strategic plan.
- Worked with the Informatica Power Center Data Profiling Option to analyze the data structure in source systems.
- Extensively worked with flat files (delimited and fixed-width) and MS Excel as sources.
- Used Erwin to design and reorganize both the logical and physical data models.
- Used SQL*Loader for bulk data loads and executed SQL scripts.
- Used various functions in the Expression and Aggregator transformations to implement data cleansing and aggregation.
- Designed and used mapplets based on the client’s own rules to standardize data formats.
- Worked with data types such as CLOB and BLOB to migrate text passages and graphic data.
- Worked with web services as sources and used the XML Parser transformation to load data into flat files or relational databases.
- Created mappings for both full loads and incremental loads of the data.
- Mostly utilized the SQL override and Lookup SQL override for filtering the data according to the requirement.
- Used IDE for data analysis, Data Migration and understanding the different patterns of the data.
- Performed unit testing and stress testing on each mapping separately for performance tuning and data verification.
- Used Informatica Pushdown Optimization to push the transformation logic and SQL query down to database to enhance the performance.
- Worked on development and testing environment and then migrated my workflows to the production environment.
- Involved in peer review, formal review, and quality control review sessions.
- Provided test queries for the QA team for the validation of the data.
- Designed restart/recovery logic and defined commit points for high-volume loads.
Environment: Informatica Power Center 8.5, Oracle 10g, Windows XP Professional, SQL, PL/SQL, SQL Developer, TOAD, XML
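The SQL*Loader bulk loads above are driven by a control file; a minimal sketch is below. The file, table, and column names are hypothetical, chosen only to illustrate a delimited flat-file load.

```shell
#!/bin/sh
# Write a minimal, hypothetical SQL*Loader control file for a delimited
# flat-file load; table and column names are illustrative only.
cat > /tmp/stg_customer.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE STG_CUSTOMER
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(CUST_ID, CUST_NAME, CITY)
EOF

# Running the load needs the Oracle client, e.g.:
#   sqlldr userid=$DB_CONN control=/tmp/stg_customer.ctl log=/tmp/stg_customer.log
cat /tmp/stg_customer.ctl
```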
- Provided technical guidance to the programming team on overall methodology, practices, and procedures for support and development, to ensure understanding of standards and of the profiling and cleansing process and methodology.
- Interacted with the team to facilitate development, provided data quality reports, performed software migration activities, and accumulated the data for the B2B solution.
- Worked in Informatica 8.x to create and deploy the business rules that populate data into tables.
- Created snapshots, summary tables, and views in the database to reduce system overhead and provide the best quality of data for reports; worked on cache management and configuration of DAC.
- Created presentation-layer tables by dragging the appropriate BMM-layer logical columns in OBIEE.
- Developed Global prompts, Filters, Customized Reports and Dashboards
- Implemented Pivot Tables and manipulated Init Blocks for report analysis.
- Performed data quality analysis, standardization, and validation, and developed data quality metrics.
- Ensured data quality at the source and target levels to generate proper data reports and profiling.
- Provided overall direction and guidance for ETL development and support of the Prescription Solutions Data Mart, applying Velocity best practices to the project work.
- Created and used reusable Mapplets and transformations using Informatica Power Center.
- Responsible for the design, development, testing, and documentation of the Informatica mappings.
Environment: Windows, UNIX, Informatica 8.x, Oracle 10g, SQL, UNIX Shell Script, OBIEE, Admin Tool.
SQL Server DBA
- Installed and configured SQL Server 2005, SSIS, and SSRS on Windows 2003 Advanced Server, and applied service packs and security fixes.
- Created logical and physical data models for the development database using the Erwin tool.
- Configured the database maintenance plans for database optimization.
- Experienced in performance tuning using Windows Performance Monitor and SQL Server Profiler.
- Created users and user groups and maintained access permissions.
- Configured and maintained transactional and merge replication for 2 environments.
- Successfully implemented Log Shipping Technique to maintain a standby server backup for Disaster Recovery.
- Configured Log shipping server to create stand by environment.
- Configured SQL Security features and maintained the user administration.
- Worked on DTS packages for transferring data from various data sources such as Oracle, MS Access, Excel, CSV, and text files.
- Developed T-SQL stored procedures as a part of database maintenance plan for new SQL Server 2005.
- Migrated databases from MS SQL Server 2000 to MS SQL Server 2005.
- Extensively used ADO.NET with the SQL Server Database.
- Performed all aspects of database administration, including backup, recovery, proactive maintenance, identifying blocking transactions, archiving/purging data, and replication for SQL Server 2005.
- Migrated database from Oracle, DB2 to MS SQL Server 2005.
- Migrated database from MS Access to SQL Server 2005.
- Involved in database designing and capacity planning.
- Rebuilt indexes at regular intervals for better performance.
- Set up database backup and recovery procedures for production, quality, and development servers.
- Created data interface that connects online database with external software.
Environment: MS SQL Server 2000/2005, Windows 2000 Advanced Server, Windows 2003 Server, TSQL, Replication, DTS, DBCC, SQL Profiler, Database Tuning Advisor, SQL Server Enterprise Manager, SQL Server Management Studio, SSAS, SSRS, SSIS, XML, DB2, Oracle 10g.
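The scheduled index rebuilds above boil down to a small T-SQL step; a sketch is below. The server, database, table, and fill factor are hypothetical, for illustration only.

```shell
#!/bin/sh
# Hypothetical sketch of the scheduled index-rebuild step: emit the T-SQL
# a maintenance job would run via sqlcmd; all names are illustrative.
rebuild_sql() {
  table="$1"
  printf "ALTER INDEX ALL ON %s REBUILD WITH (FILLFACTOR = 90);\n" "$table"
}

# A maintenance job would pipe this into sqlcmd, e.g.:
#   rebuild_sql dbo.Orders | sqlcmd -S PRODSQL01 -d SalesDB
rebuild_sql dbo.Orders
```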