- Software professional with 10+ years of experience as an ETL Developer, covering all phases of the life cycle: requirement analysis, functional analysis, design, development, implementation, testing, debugging, production support and maintenance of various data warehousing applications.
- 8 years of experience in ETL data warehousing using Informatica PowerCenter 9.6.1/9.0.1/8.x, Informatica PowerExchange 9.6.1/9.0.1 and Informatica Data Quality 9.6.1, involving planning, designing, developing and implementing data warehouses/data marts, with experience in both relational and multidimensional modeling.
- 5+ years of experience in Data Quality, Profiling, Validations, reference check and exception handling using Informatica Data Quality.
- 3+ years of experience in Change Data Capture (CDC) methodology using PowerExchange 9.0/8.x.
- 3+ years of experience in Teradata 14/13 SQL and utilities like MLOAD, Fast Load, TPT.
- 4+ years of experience in Oracle PL/SQL; extensively wrote packages, functions and procedures.
- Expert in writing optimized SQL queries using Oracle, SQL Server, Teradata, MySQL and Hive.
- Exposure in writing SQL using analytical functions like Ranking Functions, Reporting Aggregate Functions, LAG/LEAD Functions, FIRST/LAST Functions etc.
- Experience on Informatica Cloud Integration for Amazon Redshift and AWS S3 bucket.
- Excellent experience in performance analysis and SQL query tuning while designing scalable applications from scratch and maintaining existing ones, using the query optimizer, indexes, execution plans, hints, EXPLAIN PLAN and TKPROF.
- Experience in creating transformations and mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into data mart and data warehouse targets.
- Expert in performance tuning of Informatica mappings, identifying source and target bottlenecks.
- Experience in Bill Inmon and Kimball data warehouse design and implementation methodologies.
- Expertise in OLTP/OLAP System Study, E-R modeling, developing Database Schemas (Star schema and Snowflake schema) used in relational and dimensional modeling.
- Extensive experience in integration of various data sources like Oracle 12c/11g/10g, Teradata 14/13, Netezza, UDB DB2, Mainframes, SQL Server 2012, SAP, Sybase, Informix, MySQL, Flat Files, MQ Series and XML.
- Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and history load using Informatica.
- Expert in Performance tuning, troubleshooting, Indexing and partitioning techniques on Sources, Targets, Mappings and Workflows in Informatica.
- Expertise in Teradata utilities like Multi Load, Fast Load, Fast Export, BTEQ, TPUMP, TPT and tools like SQL Assistant, Viewpoint.
- Well-versed in tuning Teradata ETL queries, remediating stale statistics, resolving spool space issues, applying compression for space reclamation, etc.
- Proficient in Data Analysis, Data Validation, Data Lineage, Data Cleansing, Data Verification and identifying data mismatches.
- Extensive experience in Data analysis, ETL Techniques, MD5 logic for loading CDC.
- Experienced in writing UNIX shell scripts, SQL Loader, Procedures/Functions, Triggers and Packages.
- Skilled in creating & maintaining ETL Specification Documents, Use Cases, Source to target mapping, Requirement Traceability Matrix, performing Impact Assessment and providing Effort estimates, deployment artifacts.
- Worked many years in an onsite-offshore model; provided technical design leadership to ensure the efficient use of offshore resources and the selection of appropriate design and ETL/CDC logic.
- Extensive experience of providing IT services in Healthcare and Banking industries.
- Exposure to the end-to-end SDLC and Agile methodology.
- Strong Knowledge of Hadoop Ecosystem (HDFS, HBase, MapReduce, Hive, Pig, NoSQL etc.)
- Strong experience in handling multiple projects, in multi-vendor environment at the same time with full customer satisfaction.
- Excellent documentation, team problem-solving, analytical and programming skills in a high-speed, quality-conscious, multitasking environment.
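The analytical-function work noted above (LAG/LEAD and friends) can be illustrated with a small, self-contained sketch. The table and figures are hypothetical, and SQLite (3.25+) stands in for the production databases; the LAG pattern itself is the same in Oracle or Teradata:

```python
import sqlite3

# Hypothetical daily_sales table; SQLite 3.25+ supports window functions,
# so it stands in here for Oracle/Teradata SQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE daily_sales (sale_date TEXT, amount INTEGER);
    INSERT INTO daily_sales VALUES
        ('2020-01-01', 100), ('2020-01-02', 150), ('2020-01-03', 120);
""")

# LAG pulls the previous row's value, giving a day-over-day delta per row.
rows = conn.execute("""
    SELECT sale_date,
           amount,
           LAG(amount) OVER (ORDER BY sale_date)          AS prev_amount,
           amount - LAG(amount) OVER (ORDER BY sale_date) AS day_over_day
    FROM daily_sales
    ORDER BY sale_date
""").fetchall()
for row in rows:
    print(row)
```

The first row's delta is NULL because it has no predecessor; LEAD works the same way in the opposite direction.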
Operating Systems: WIN NT/2000/XP/VISTA, UNIX (Sun Solaris, AIX, HP, LINUX).
Application Software: MS-Office.
Tools: PL/SQL Developer, TOAD, SQL Developer, Edit Plus, SQL*Plus, SVN.
Languages: SQL, PL/SQL, C, C++, Java, Pro* C.
RDBMS: Oracle 11g/10g/9i/8i, SQL Server 2005, Teradata 13/12v.
SCRIPTING Languages: HTML, Shell, Perl, Python, JavaScript, XML, XSL.
ETL: DataStage, Informatica PowerCenter 9.x/8.6.1/8.1/7.1/6.2, SSIS.
Data Modeling Tools: Erwin r7/r4/3.5.
Other tools: Putty, Toad, PL/SQL Developer, WinSCP, Crystal Reports.
OLAP: Hyperion Essbase and Planning.
Sr. Informatica Developer
- Worked with BAs in preparing functional specifications and participated in user meetings.
- Prepared technical specifications for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various data mart tables and defined the standards.
- Performed impact analysis, identifying gaps and code changes to meet new and changing business requirements.
- Created Informatica mappings to extract data from the Trizetto Facets 4.51 and 4.3 claims databases; analyzed the Trizetto Facets 4.51 data model and assisted the data architecture team in building the claims canonical data model in the ODS.
- Worked with various Informatica transformations like Joiner, Expression, Lookup, Aggregator, Filter, Update Strategy, Stored Procedure, Router and Normalizer.
- Worked with connected and unconnected stored procedures for pre- and post-load sessions.
- Conducted SQL testing for database sources for insert and update timestamps, Counts, Data definitions and Error logic for Facts and Reference tables.
- Developed ETL routines using Informatica PowerCenter and created mappings involving transformations like Lookup, Aggregator, Rank, Expression, Mapplets, connected and unconnected stored procedures, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Extensively used ETL processes to load data from various source systems such as Oracle, DB2, SQL Server, Teradata, flat files and XML files into the target SQL Server system, applying business logic in transformation mappings to insert and update records during the load.
- Involved in developing scripts, routines for database consistency checks, and bulk validation for legacy systems.
- Tuned the performance of multiple Informatica mappings to keep pace with the daily volume of internet orders entering the system; the techniques involved were pushdown optimization (PDO), bulk loading and adding indexes on the staging tables.
- Used the PowerExchange Change Data Capture (CDC) option to capture inserts, updates and deletes as soon as they occur and transfer them to multiple targets without intermediate queues or staging tables.
- Used Teradata utilities like BTEQ, FASTLOAD, MLOAD and FASTEXPORT.
- Sourced data from AWS RDS and S3 buckets and loaded it into Teradata targets.
- Mounted S3 bucket in local UNIX environment for data analysis.
- Worked with the PowerCenter team to load data from external source systems to the MDM hub.
- Extensively used Informatica Workflow Manager and Workflow Monitor to develop, manage and monitor workflows and sessions.
- Used database partitioning and Informatica partitioning for better performance when working with large data volumes.
- Used XSL for transforming the XML files in implementing the front-end template system.
- Hands on experience with Informatica Power center and implementing data quality rules with IDQ.
- Used CA7 scheduling tool for scheduling the Informatica workflows.
- Analyzed session log files when sessions failed to resolve errors in mapping or session configurations.
- Wrote UNIX shell scripts for file manipulation, FTP and scheduling workflows.
- Coordinated with the offshore team on a daily basis to enable faster development.
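The MD5 comparison logic mentioned in the summary for CDC-style loads can be sketched as a row-hash check. The helper and column values below are hypothetical illustrations, not the actual mapping logic:

```python
import hashlib

def row_md5(row):
    # Hash the concatenated column values; comparing one hash per row is
    # cheaper than comparing every column when detecting changed records.
    joined = "|".join("" if v is None else str(v) for v in row)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

existing = ("CUST1", "Tampa", "FL")
incoming = ("CUST1", "Chicago", "IL")

# A differing hash flags the row as changed and routes it to an update path.
print(row_md5(existing) != row_md5(incoming))                  # True: changed
print(row_md5(existing) == row_md5(("CUST1", "Tampa", "FL")))  # True: unchanged
```

In a mapping, the same idea is typically an expression that computes the hash on the source side and a lookup that compares it against the hash stored with the target row.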
Environment: Informatica PowerCenter 9.6.1/9.5.1 (Repository Server, Repository Manager, Designer, Workflow Manager and Workflow Monitor), MDM (Master Data Management), IDQ 9.1, Teradata 14.1.0, Oracle 11g, AWS RDS, S3 Bucket, Toad, PL/SQL, SQL Server 2008, QlikView, SQL, UNIX Shell Scripts, XML, XSL.
Confidential, Tampa, FL
Sr. Informatica Developer
- Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
- Developed a number of mappings, Mapplets and reusable transformations to implement the business logic and load the data incrementally.
- Developed mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers and data flow management into multiple targets using Router transformations.
- Used various transformations like Filter, Expression, Sequence Generator and Update Strategy.
- Used the PowerCenter server manager/Workflow Manager for session management, database connection management and scheduling.
- Created multiple Type 2 mappings in the Customer mart for both dimension and fact tables, implementing both date-based and flag-based versioning logic.
- MDM Hub configurations: data modeling, data mappings, data validation, match and merge rules, Hierarchy Manager, customizing/configuring.
- Defined requirements for data matching and merging rules, survivorship criteria and data stewardship workflows that can be deployed in the MDM implementation.
- Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
- Translated the PL/SQL logic into Informatica mappings including Database packages, stored procedures and views.
- Administered Informatica PowerCenter and change data capture software used for data integration.
- Performed Admin Jobs and Tasks, reviewed logs, Deployed ETL Codes.
- Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
- Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.
- Set up Users and User accounts as per requirements.
- Worked extensively with version control software (GitHub).
- Actively participated in data quality services and frameworks for data quality.
- Created UNIX scripts to read/write and ftp files from and to windows servers and UNIX.
- Created Unit test plans and did unit testing using different scenarios separately for every process. Involved in System test, Regression test and supported the UAT for the client.
- Performed ETL and database code migrations across environments using deployment groups.
- Applied business rules in mappings to populate the target tables.
- Developed Informatica Data Quality Mappings, sessions, workflows, scripts and orchestration Schedules.
- Involved in end-to-end system testing, performance and regression testing and data validations.
- Worked extensively on modifying and updating existing Oracle code, including object types, views, PL/SQL stored procedures, packages, functions and triggers, based on business requirements.
- Worked in Agile minor release cycles as the designated database developer.
- Unit tested and supported QA and UAT testing for database changes.
- Managed performance and tuning of SQL queries and fixed the slow running queries in production.
- Helped support data masking projects for DRD across all Dev, QA and UAT environments via Enterprise Data Obfuscation.
- Build creation, verification and deployment to QA and UAT using Transporter.
- Created batch scripts for automated database build deployment.
- Extensively worked on Teradata.
- Prepared the project plan and obtained management approval of the plan.
- Allocated project resources according to the approved project plan.
- Analyzed risks and initiated avoidance activities; established contingency plans and identified trigger events and responsibility for initiating corrective action.
- Prepared project status reports.
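The date-based and flag-based Type 2 versioning described in the mappings above can be sketched in miniature. This is an illustrative Python model with made-up field names, not the Informatica mapping itself:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, load_date):
    """Type 2 merge sketch: expire the current version of a changed record
    (flag to 'N', set end_date) and insert a new current version."""
    for key, attrs in incoming.items():
        current = next((r for r in dim_rows
                        if r["key"] == key and r["current"] == "Y"), None)
        if current is None:
            # Brand-new dimension member: insert the first version.
            dim_rows.append({"key": key, **attrs, "eff_date": load_date,
                             "end_date": None, "current": "Y"})
        elif {k: current[k] for k in attrs} != attrs:
            # Attributes changed: close out the old version, open a new one.
            current["current"] = "N"
            current["end_date"] = load_date
            dim_rows.append({"key": key, **attrs, "eff_date": load_date,
                             "end_date": None, "current": "Y"})

dim = []
apply_scd2(dim, {"C1": {"city": "Tampa"}}, date(2020, 1, 1))
apply_scd2(dim, {"C1": {"city": "Chicago"}}, date(2020, 6, 1))
print(len(dim))  # 2: one expired history row plus one current row
```

The flag (`current`) and the date pair (`eff_date`/`end_date`) are the two versioning mechanisms; a mapping may use either or both.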
Confidential, Portland, OR
- Worked with the Tech Manager and Business Analysts of the team to understand prioritization of assigned trouble tickets and the business user needs/requirements driving the support requests.
- Performed problem assessment, documentation, performance review and quality validation on new and existing BI environments.
- Created the ETL technical specification for the effort based on business requirements, the source system being mainly Oracle Master Data.
- Worked closely with business users, Business Analysts and Shared Services Technical Leads in defining technical specifications and designs for an Oracle-based large data warehouse environment.
- Developed detailed ETL specifications based on the technical specification for the BI effort within the ETL design standards and guidelines.
- Served as S&P’s ETL expert, performing technical development, maintenance, tuning and support activities.
- Ensured testing data is available for unit, system and acceptance testing within development and QA environments.
- Unit tested ETL code/scripts to ensure correctness of functionality and compliance with business requirements.
- Refined ETL code/scripts as needed to enable migration of new code from development to QA and QA to production environments following Confidential migration signoff procedure.
- Performed system testing, integration testing and helped user for UAT based on business requirements.
- Coordinated among multiple teams (ETL Admin, UNIX Admin) and the management team for ETL deployment approval.
- Reviewed ETL development and worked closely to drive quality of implementation; ensured unit testing was completed and quality audits were performed on the ETL work.
- Designed the ETL specification documents to gather workflow information from offshore and shared them with the integration and production maintenance teams.
Environment: Oracle 11g/10g/9i/8i, SQL, PL/SQL, UNIX, Shell scripting, XML, XSL, Informatica Power Center 8.6, .NET, Crystal Reports, Oracle Hyperion Essbase and Planning, MS SQL Server, COBOL, JCL.
Confidential, San Jose, CA
- Involved in design of the database and created data marts extensively using Star Schema.
- Worked extensively on SQL and UNIX shell scripting.
- Involved in developing packages for implementing business logic through procedures and functions.
- Extensively involved in application tuning, SQL tuning using Explain Plan.
- Created high-level and low-level functional and technical specification documents for application development.
- Handled the full project life cycle from analysis to production implementation, with emphasis on identifying the source and source data validation, developing particular logic and transformations as per the requirement, and loading the data into different target tables.
- Performed dimensional data modeling: Star Join Schema/Snowflake modeling, fact and dimension tables, physical and logical data modeling.
- Used Teradata Utilities BTEQ, Fast Load, Multi Load utilities for loading bulk data.
- Enhanced query performance by applying optimization techniques such as index creation, table partitioning and coding stored procedures.
- Used Teradata SQL Assistant to run SQL queries and validate the data in the warehouse.
- Performed load and integration tests on all programs created and applied version control using Harvest tool to ensure that programs are properly implemented in production.
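The effect of index creation on a query plan, one of the optimization techniques above, can be demonstrated with SQLite's EXPLAIN QUERY PLAN. The table is hypothetical, and SQLite merely stands in for the production databases; Oracle's EXPLAIN PLAN and Teradata's EXPLAIN serve the same purpose:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, status TEXT)")

# Without an index the optimizer has to scan the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM stg_orders WHERE order_id = 42"
).fetchall()
print(plan_before[0][-1])   # SCAN ...

# After indexing the filter column, the plan switches to an index search.
conn.execute("CREATE INDEX idx_stg_orders_id ON stg_orders(order_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM stg_orders WHERE order_id = 42"
).fetchall()
print(plan_after[0][-1])    # SEARCH ... USING INDEX ...
```

Reading the plan before and after a change is the quickest check that an index is actually being used rather than ignored by the optimizer.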
Environment: Oracle 11g/10g/9i/8i, SQL, PL/SQL, UNIX, Shell scripting, Informatica Power Center 8.6, Teradata TD 12, UNIX Shell Scripts, Harvest and Control-M.
Confidential, Chicago, IL
- Created and modified PL/SQL triggers, procedures, functions and packages.
- Developed SQL scripts to create database objects like tables, views and sequences.
- Applied data warehouse and ETL concepts.
- Designed database tables using various normalization techniques and database rules.
- Developed complex before/after triggers in reports for validation of user input.
- Conducted tuning for SQL, PL/SQL procedures, Informatica objects and views.
- Performed unit testing and supported integration testing and end-user testing.
- Extensively worked on production issues with effective defect management.
- Prepared the backup schedule, which included offline, online and logical backups.
- Implemented Oracle packages using pipelined table functions.
- Implemented virtual private database for data security.
- Implemented referential integrity and business rules as per design document.
- Worked on analytical functions and autonomous transactions.
- Developed UNIX shell scripts with embedded SQL*Loader, expdp/impdp calls and PL/SQL statements to extract data in the form of flat files for data loading and migration.
- Monitored the performance of the database generating AWR reports.
- Implemented UNIX shell scripts for data migration and batch processing.
- Worked extensively with the UTL_FILE utility for file operations.
- Worked on Oracle 10g/11g/Exadata, including designing tables, stored procedures, functions and triggers, and performance tuning.
- Participated in code reviews, maintained checklist and document preparation.
Environment: Windows NT/2000, UNIX, Oracle 11g/10g/9i/8i SQL, PL/SQL, Java, ERWIN 4.7.