- 20+ years of total IT experience in Business Requirement Gathering, Analysis, Application Design, Data Modeling, Development, Migration, Implementation and Testing of data warehouse applications for the Power, Pharmaceutical, Financial, Healthcare and Insurance industries.
- 15+ years of data warehousing experience utilizing Informatica 1.7/4.x/5.x/6.x/7.x/8.x (High Availability, Team-Based)/9.x/10.x for ETL (Extract, Transform, Load), Data Engineering Integration (BDM), Data Engineering Quality (BDQ), IDQ and IICS on Sun Solaris, Linux, AIX, HP-UX and Windows operating systems.
- 8+ years of data modeling experience using Relational, Enterprise and Dimensional data modeling. Worked on various architectures such as Independent, Bus Architecture, Hub-and-Spoke (Corporate Information Factory), Centralized Data Warehouse and Federated Architecture, and created physical and logical data models using ERwin 3.x/4.x/7.x.
- Extensive knowledge of OLAP tools such as Business Objects 5.1.1 (Supervisor, Designer & Web Intelligence 2.6) and Cognos (Impromptu Administrator 5.0, PowerPlay 6.5, Scheduler 5.0, Impromptu Web Reports & Script Writer).
- 10+ years of database experience using Oracle 11g/10g/9i/8i/7.x (PL/SQL, SQL*Plus, Developer 2000), Sybase SQL Server 12.0/11.x, Teradata V2R 4.0/5.0/12/15, Netezza 4.0.x/3.x, DB2, MS SQL Server 2000/7.0/6.5, MS Access 7.0/2000, and Informix on Win 3.x/95/98/2000, Win XP, Win NT 4.0, HP-UX 11.1 and Sun Solaris 2.x.
- Knowledge of data integration architecture, providing thought leadership around best practices and standards across Informatica Intelligent Cloud Services (IICS), with a focus on cloud application integration and API management between cloud and on-premises systems.
- Led technical teams for on-shore/off-shore global delivery. Interacted across all levels of business and technology management. Strong and effective management of technology, process and people. Mentored and developed junior staff members.
- PASCAL, COBOL, PL/SQL; Oracle 11g/10g/9i/8i/7.x, MS SQL Server 2000/7.0/6.5, DB2 UDB 7.0, Netezza 4.0.x/3.x, Teradata V2R 4.0/5.0/12/15 and MS Access
- ASP, Visual Basic 6.0, Informatica (PowerMart/PowerCenter 4.7/1.7/5.1/6.1/6.2/7.1/8.1.1/8.6/9.1.0/9.5.0/9.6.1), IICS, PowerConnect 8.x for SAP/PeopleSoft/DB2/Netezza/Siebel/TPT
- PowerExchange 5.1/8.x, Informatica B2B Data Exchange 8.6.1/8.6.2, Metadata Manager 8.6.1, IDE 9.1.0/9.5.0, IDQ 9.1.0/9.5.0, Web Services, DataMirror 6.0, Kimball Methodology
- Cognos (Impromptu Administrator 5.0, PowerPlay 6.5, Scheduler 5.0, Impromptu Web Reports, Script Writer), Business Objects 5.1.1 (Supervisor, Designer, BO), Web Intelligence 2.6
- ERwin 3.x/4.1/7.2.5, Trillium 5.0, Toad 7.4/8.5/9.2, SQL Navigator 4.1/3.2, Composite 4.6/Data Federation 8.6.1, XML, Visual SourceSafe, WebSphere, and Apache Web Server
- Trained Data Owners and Stewards to triage quality problems found via Data Watch Control room dashboards
- Built BDQ Enterprise Data Discovery profiles, link rules, join profiles, overlapping discovery, foreign key discovery, etc., to identify exceptions.
- Experienced in defining systems strategy, developing system requirements, designing and prototyping, testing, training, defining support procedures, and implementing practical business solutions under multiple deadlines
- Executed standard onboarding of technical and business metadata into the Informatica EDC and Axon environments, ensuring the population of data lineage and linkage between the technical and business metadata
- Participated in the development and implementation of enterprise metadata standards, guidelines, and processes to ensure quality metadata and support for ongoing data governance.
- Facilitated discussions with data stakeholders on data governance processes and translated those requirements into Axon workflows.
- Participated in training sessions for business and technology partners covering enterprise metadata standards, guidelines, and processes.
- Created and maintained data governance framework that establishes roles and responsibilities for data governance and decision making to meet the enterprise's data objectives and goals.
- Investigated data quality issues and trends via dashboards sourced with data from BDQ, EDC and Axon using REST APIs.
- Hands on development experience using Informatica Enterprise Data Catalog (EDC), Axon Data Governance and Informatica Big Data Quality (BDQ)
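The dashboard sourcing described above can be sketched in miniature. This is an illustrative assumption, not actual project code: the payload shape, rule names and pass-rate threshold are hypothetical stand-ins for what a BDQ/EDC/Axon REST endpoint might return.

```python
import json

def summarize_dq_payload(payload_text, threshold=0.9):
    """Summarize rule-level pass rates from a (hypothetical) data quality
    REST payload and flag the rules that fall below the threshold."""
    results = json.loads(payload_text)["results"]
    failing = [r["rule"] for r in results if r["pass_rate"] < threshold]
    avg = sum(r["pass_rate"] for r in results) / len(results)
    return {"average_pass_rate": round(avg, 3), "failing_rules": failing}
```

A dashboard layer would call such a summarizer per source system and trend the averages over time.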
Environment: ETL-Informatica Data Engineering Integration (BDM), Data Engineering Quality (BDQ) 10.4, AXON 6.3, EDC 10.4, Secure@Source, Hadoop 2.x (platform), HBase, Hive (HQL), Sqoop, Spark, Postgres 12.4, Snowflake, Oracle, MongoDB, Amazon S3/Glue, MS SQL Server, Flat files, JSON, XML, Power BI
- Created and executed profiles in Informatica Developer to discover data quality issues in a data set and to understand the relationships between columns in a data set.
- Worked independently to create ad-hoc analyses and troubleshoot issues at the request of users.
- Worked closely with cross-functional business and business intelligence teams to document ETL requirements and turn them into ETL jobs; performed peer code reviews, analysis and requirements gathering, assisted offshore developers, and developed/tested ETL objects.
- Hands-on development of POCs and frameworks; managed offshore and onshore implementation teams to deliver in significantly reduced time against an aggressive timeline.
- Prepared architectural guideline documents covering data naming standards, the list of valid class words, and a naming and abbreviation standards glossary (.nsm) that can be imported into ERwin.
- Creation of B2B DX partners, profiles, workflows, endpoints for onboarding clients.
- Worked on the split and merge processes of multi-client files using Informatica B2B Data Transformation and Informatica B2B Data Exchange.
- Performed end-to-end data analysis and data lineage tracking from source system to target and downstream systems, and conducted data analysis and investigation when data issues were identified.
- Documented best practices and troubleshot complex situations, modifying the code to behave optimally and efficiently.
- Architected and deployed a complete end-to-end solution
- Responsible for migration of data out of the STAR system to an Oracle database. Designed the logical and physical data models and provided data flow diagrams, transformation logic, source-to-target data mapping spreadsheets, a data dictionary, etc.
- Drive discussions with business partners to extract and document the requirements and critical business logic necessary for the data and operational reporting needs of the business lines, and recommend solutions to enable the business lines to meet their goals.
- Create end-to-end data and business flows, leveraging insight maps to visualize and expose impacts, interdependencies, duplication, fragmentation, and more using various facets of Axon.
- Hands-on experience using Informatica Data Services (v9.x), Data Quality Developer (v9.x) and Informatica Data Quality Analyst tool (version 9.x) to analyze, profile, and develop cleansing and standardization rules, EIC, AXON 5.0/5.2/HotFix2/6.0
- Developed reusable codes, Reference data look-ups, common mapplets to handle error logging and Error handling, Data exception logging.
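The error-handling pattern in the bullet above (divert failing records to an exception log rather than abort the load) can be sketched as follows. This is a hedged illustration of the idea in Python; the function and field names are hypothetical, not code from any actual mapplet.

```python
def load_with_exception_log(records, transform, error_log):
    """Apply `transform` to each record; records that raise are diverted
    to `error_log` (the data-exception log) so that one bad row does
    not abort the whole load."""
    loaded = []
    for rec in records:
        try:
            loaded.append(transform(rec))
        except Exception as exc:
            error_log.append({"record": rec, "error": str(exc)})
    return loaded
```

Packaging the pattern once (as a mapplet, or a helper like this) is what makes the error logging reusable across mappings.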
Environment: ETL-Informatica 10.2/9.1.0/9.5.0/9.5.5/9.6.1/8.1.1/8.6 (High Availability, Team-Based), IDS 9.5.0, PowerExchange/Connect 8.0 (for CDC/B2B/Redshift), IDE 9.1.0, IDQ 9.1.0, Web Services, AXON 5.0/5.2/6.0, EIC/EDC, Composite 4.6/Data Federation 8.6.1, PowerCenter Metadata Manager 8.6.1, Salesforce, Veeva, OBIEE, Spotfire, Trillium 5.0, Tableau, Business Objects, Teradata 15/12 (FastLoad, MultiLoad & TPump), PowerExchange for Teradata Parallel Transporter API, Oracle 10g/11g, SQL*Loader, MS SQL Server 2000, SQL Navigator 5.5/4.1/3.2, Flat Files, Toad 8.5.1, Visio 2007, ERWIN 3.5.2/4.x/7.2.5, SharePoint, Shell Script, Unix SunOS 2.8 and Windows XP.
Confidential, Swiftwater, PA
- Served as DW/ETL Architect for a global strategic enterprise implementation of Sales & Marketing DW and Operational Data store.
- In-depth knowledge on Informatica B2B Data transformation for Complex XML parsing.
- Reduced time to market for New product On-boarding from 12 months to 8 weeks.
- State-of-the-art error management and exception handling strategy.
- Designed a complex DB model for reference data, transaction data and data quality exceptions.
- Provided suggestions to improve the enterprise architecture (advanced XML, like FTML) to benefit data sourcing for the data warehouse and downstream consumers.
- Documented best practices and troubleshot complex situations (e.g., resource and CPU usage), modifying the code to behave optimally and efficiently.
- Developed reusable code for B2B XML parsers, reference data look-ups, and common mapplets to handle error logging, error handling and data exception logging.
- Designed the parameterized code base to support multiple environments with multiple stacks.
Environment: Oracle 11g (v2), Informatica 8.6.1 (HF-12)/9.x, SQL/PL-SQL, ksh, Sybase PowerDesigner, MS Visio, B2B Data Studio 8.6.2, PowerExchange (SAP), AutoSys.
Confidential, Broadway, NY
- Managed the ETL offshore developers and ETL functions within the Data Warehouse group
- Assisted team members in developing complex ETL using Informatica, SQL and UNIX shell scripts to meet business data load requirements
- Provided technical leadership and considered long-range technical issues
- Forecast completion dates and reliably delivered against those estimates for projects and work requests
- Allocated team members to projects and work requests to ensure project schedules were met
- Interfaced with business users, project managers and technical teams to define requirements and specifications
- Involved in tuning SQL, PL/SQL code, stored procedures and database triggers.
- Worked with the Data Warehouse Reporting, Web Engineering and Business Systems teams to understand data sources and implement new requirements
- Developed various mappings using Informatica PowerCenter and tested the mappings.
- Responsible for mentoring various team members with Informatica and the overall architecture of the ETL process.
- Automated the job-scheduling process using the AutoSys scheduler and the UNIX command line.
Environment: ETL-Informatica 8.1.1/8.6 (High Availability, Team-Based), Oracle 10g, AutoSys, Unix, ERwin 7.2 and HP-UX
- Analyzed and designed the Common Dimension Layer (CDL) and implemented Slowly Changing Dimensions (Type 1, Type 2, Type 3 & Type 6).
- Created project estimates and developed various scenarios to arrive at the best option.
- Wrote documents for ETL mapping design, error log design, Data load strategy, unit and system testing, system migration and job schedules.
- Implement a zero-defect data quality process that includes rigorous checkpoints throughout the systems development lifecycle (SDLC), regression testing, end-to-end traceability of data transformations, and automated reconciliation
- Resolved data and process support issues. Defined and deployed a QC process to help the business team cleanse the source data.
- Analyzed 3rd-party clinical trial flat file data from CRO/IVRS, FSP and ClinPhone, and designed/developed mappings.
- Designed overall error handling and loading strategies for various projects.
- Helped the ETL Testing team for effective testing of transformed data and defining the test cases.
- Designed packages to make changes to Unix/Perl/Netezza SQL scripts, and implemented the designed packages using a check-in/check-out process.
- Created materialized views, table partitions and dynamic indexes (bitmap, B-tree); analyzed tables and indexes; wrote optimal PL/SQL packages and procedures using bulk collect and bind methods; and tuned queries for source data selection and for populating fact and summary tables.
- Implemented Change Data Capture (CDC) technique to detect the changes and load accordingly into the warehouse.
- Ensured deadlines are met; all best practices and enterprise standards are followed.
- Worked with the PowerExchange Client for PowerCenter to extract, in real time, changed data representing the changes to a source from the change stream.
- Worked with various transformations like SQL, Flexible Key, Application source qualifier, Application Multi Group Source Qualifier etc.
- Conduct code reviews and enforce quality standards for changes and enhancements
- Worked on DataMirror to create a prototype project for Oracle Clinical.
- Performed ETL performance tuning by modifying index and data cache sizes, DTM buffer size and pipeline partitioning, defining additional partition points, and modifying the partition type.
- Handled tables from 30 million to 2 billion records using various techniques such as brute force, CDC, timestamps and difference of views.
- Acquired knowledge of the service-oriented architecture of Informatica 8.x
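The Slowly Changing Dimension work above centers on the Type 2 pattern: when a tracked attribute changes, expire the current dimension row and insert a new version. A minimal sketch of that logic follows; the in-memory dicts and the `cust_id`/`city` names are hypothetical illustrations, not code from these projects.

```python
from datetime import date

# Conventional "open-ended" effective date for the current row version.
HIGH_DATE = date(9999, 12, 31)

def scd2_apply(dim_rows, incoming, key, tracked, load_date):
    """Apply SCD Type 2 changes to a dimension held as a list of dicts.

    dim_rows -- existing rows, each with 'eff_from'/'eff_to' columns
    incoming -- source rows keyed by `key`
    tracked  -- attributes whose change triggers a new row version
    """
    current = {r[key]: r for r in dim_rows if r["eff_to"] == HIGH_DATE}
    for src in incoming:
        cur = current.get(src[key])
        if cur is None:
            # brand-new key: insert the first version
            dim_rows.append({**src, "eff_from": load_date, "eff_to": HIGH_DATE})
        elif any(cur[c] != src[c] for c in tracked):
            # changed: expire the current version, append a new one
            cur["eff_to"] = load_date
            dim_rows.append({**src, "eff_from": load_date, "eff_to": HIGH_DATE})
    return dim_rows
```

In PowerCenter the same pattern is typically built with a Lookup plus an Update Strategy transformation; the sketch only shows the row-versioning decision itself.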
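Among the change-detection techniques listed above, the timestamp approach is the simplest to sketch: keep a watermark from the previous extract and pull only rows modified after it. This is a hedged illustration of the idea, with hypothetical column names, not the PowerExchange CDC implementation itself.

```python
def cdc_delta(rows, watermark, ts_col="last_updated"):
    """Timestamp-based change data capture: keep only rows modified
    after the previous extract's watermark, and return the new
    watermark to persist for the next incremental run."""
    delta = [r for r in rows if r[ts_col] > watermark]
    new_watermark = max((r[ts_col] for r in delta), default=watermark)
    return delta, new_watermark
```

Persisting `new_watermark` between runs (e.g., in a control table) is what turns a full extract into an incremental one.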
Environment: ETL-Informatica 8.1.1/8.6 (High Availability, Team-Based)/6.2.2 PowerCenter, PowerExchange/Connect 8.0 (for CDC), PowerExchange (for SAP), Composite 4.6/Data Federation 8.6.1, PowerCenter Metadata Manager 8.6.1, Trillium 5.0, Business Objects, Oracle 10g, SQL*Loader, MS SQL Server 2000, SQL Navigator 5.5/4.1/3.2, Flat Files, Toad 8.5.1/8.0/7.2, Visio 2003, ERWIN 3.5.2/4.x/7.2.5, LiveLink, SharePoint, Concurrent Versions System (CVS), DataMirror 6.0, Tortoise, Shell Script, Unix SunOS 2.8 and Windows 2000/XP.
Lead Data Warehousing Developer
- Decided the error handling, reconciliation process and educated the developers.
- Interacted with business users, gathered the requirements and detailed them in a technical specification for PCSA.
- Created database objects for updating protocol category and subject area on HTS, DG and LTS (operational databases) and RSDB (warehouse database).
- Created MS-DOS command files and SQL scripts for the conversion process.
- Created procedures, functions, triggers and packages for the target system, the Activity Base 5.0.10 database schema implemented on Teradata, the Oracle database server and the RSDB data warehouse.
- Worked extensively on various Teradata load utilities like FastLoad, MLoad & TPump to populate data into Teradata tables.
- Created test plan, test design, test cases, test reports and anomaly reports in the process of testing the application.
- IDBS Activity Base is a suite of applications to capture, analyze, manage, and utilize biological and chemical data.
Environment: ETL-Informatica 7.1 PowerCenter, Teradata V2R4.0, Oracle 8i, Visio 2003, IDBS Activity Base 5.0.10, SQL*Loader, Toad 7.x, HP-UX 11.1
- Installation, configuration and optimization of Informatica on Windows 2003.
- Gathered requirements from the business users and business analyst and created estimates for Informatica mappings.
- Identified and managed project risks; defined standards; resolved issues and conflicts
- Prepared the functional requirements by interacting with the business users, and produced high-level and low-level design documents.
- Decided the error handling, reconciliation process and educated the developers.
- Defined ETL design and implementation standards and procedures for MIG2, FIS, MTS and MDS systems
Environment: ETL-Informatica 7.1 PowerCenter (Designer, Repository Manager, Workflow Manager, Workflow Monitor), UDB DB2, COBOL, Informix, MS SQL Server 2000, Cognos ReportNet 1.1, PVCS, Visio 2003, ERwin and Windows 2003