Oracle SQL, Informatica Lead/Architect Resume
SUMMARY
- 9+ years of IT experience in software analysis, design, development, testing, and implementation of client/server business systems such as Business Intelligence and Data Warehousing.
- Translate requirements into various documentation deliverables such as functional specifications, workflow /process diagrams, and data flow/data model diagrams.
- Experience in creating dashboards in QlikView and converting Oracle scripts to Hive scripts in Big Data environments.
- Experience in Informatica Client tools: Power Center Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Experience in all phases of Software Development Life Cycle (SDLC) such as development, testing, migration, administration, and production support on various platforms like UNIX, Windows 10.
- Experience in gathering, analyzing, and documenting business and functional requirements, and in designing and developing mappings based on those requirements.
- Experience in implementing business rules by creating transformations (Expression, Aggregator, Unconnected and Connected Lookup, Router, Update Strategy, Filter, Joiner, Union) and developing mappings.
- Strong knowledge of data warehousing concepts, Data Marts, Star Schema and Snowflake Schema modeling, Fact tables, and Dimension tables.
- Implemented Slowly Changing Dimension methodology for accessing the full history of accounts and transaction information.
- Experience in the concepts of building Fact Tables, Dimensional Tables, handling Slowly Changing Dimensions and Surrogate Keys.
- Experience in Migration of objects between environments, from Local (DEV) to Test (QA).
- Experience in writing complex SQL queries and Unix Shell Scripting.
- Excellent experience in Performance tuning in SQL and query optimization.
- Strong knowledge of relational databases like Oracle 9i/10g/11g/12c on platforms like Windows/UNIX/Linux, using GUI tools like TOAD and PL/SQL Developer.
- Analytical and Technical aptitude with the ability to solve complex problems.
- Strong interpersonal skills with the ability to interact with end-users, managers, and technical personnel.
- Well versed in Agile and Waterfall Methodologies.
- Experienced and well versed in writing UNIX Shell scripts as part of file manipulation, scheduling and text processing.
- Experience in Data Warehouse Full Life Cycle and Methodology.
- Worked as a Data Modeler on database creation for data migration.
- Involved in Unit Testing, Integration testing and QA Validation.
- Experience in designing and developing database applications and data support systems (ETL/ELT) for Data Warehouse / Data Mart databases, Data Staging, Operational Data Stores and Data summarization for Decision Support Systems (OLAP/BI/Analytics) of various data volumes employing data models, requirement specifications, functional specifications, business rules, validation rules, system statistics, etc.
- Experience in design and development of various Data Integration, Data Migration and ETL applications in different types of SDLC processes employing Data mappings, UNIX shell scripting / Windows scripting, Database programming such as UDF, stored procedures, functions, SQL scripting.
- Experience in various data profiling methods to perform analysis of structured, semi-structured and unstructured data sources and data consumables.
- Experience in application metadata gathering and metadata analysis of system configuration, application complexity, runtime statistics, etc.
- Experience using Informatica Power Center / Power Exchange / Analyzer / Developer, Oracle Warehouse Builder, and external loaders such as Oracle SQL*Loader, Perl, etc.
- Experience in implementing Data Quality Processes to perform Data Cleansing / Data Matching and Data Standardization using IDQ, Data Flux, etc.
- Experience in tuning database queries and ETL application processes. Good understanding of database and data systems optimization.
- Experience in Data Masking, Data Subset for non-production data environments with Informatica Power Center.
- Experience in Informatica Power Center Load balancer and HA clustered systems. Involved in process recovery and dynamic partitioning solutions in multi-node Environment.
- Thorough understanding of mapping / transformation rules, use cases, data models / dictionaries and other artifacts constituting database and backend application systems design and development.
- Understanding of modern data platforms in the Hadoop ecosystem, namely Hive database platforms and data flow systems.
- Define, analyze and translate functional and non-functional Business requirements into activity diagrams, system use cases/user stories and supplemental specifications that contribute to effective software development.
- Conduct Gap analysis on existing processes and potential alternatives. Identify and report on trends and patterns found within the data.
- Design, develop and test the implemented extraction and transformation programs, processes and cycles for historical and current data loads.
- Used logical data models, physical designs, and business rules to create mapping definitions.
- Enforce Data standards, resolve data issues, complete testing and create and update system documentation.
- Analyze complex business requirements and design, build, and implement technology enabled solutions to address multi-discipline business opportunities/requirements.
- Investigate, analyze and address complex technical problems within the ETL processes.
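The Slowly Changing Dimension handling noted above is commonly implemented in Oracle as a two-step expire-and-insert; a minimal Type 2 sketch, assuming hypothetical CUSTOMER_DIM and CUSTOMER_STG tables, a CUSTOMER_DIM_SEQ surrogate-key sequence, and a single tracked attribute:

```sql
-- Minimal SCD Type 2 sketch (table, column, and sequence names are illustrative).
-- Step 1: expire the current row for any customer whose tracked attribute changed.
UPDATE customer_dim d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM customer_stg s
                WHERE s.customer_id = d.customer_id
                  AND s.address    <> d.address);

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO customer_dim (customer_sk, customer_id, address,
                          eff_start_date, eff_end_date, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM customer_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id  = s.customer_id
                      AND d.current_flag = 'Y'
                      AND d.address      = s.address);
```

The same expire/insert logic maps directly onto an Informatica Update Strategy plus Sequence Generator design; the surrogate key keeps the full account/transaction history queryable by effective-date range.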
PROFESSIONAL EXPERIENCE
Confidential
Oracle SQL, Informatica Lead/Architect
Responsibilities:
- Designed a technology platform that addresses the monolithic nature of the current system and moves to a solution that efficiently and effectively facilitates maintaining, changing, and enhancing the solution based on business demands and needs.
- Enabled robust process and service orchestration with industry-standard business and compliance validations using modern design patterns, optimizing all processes in the end-to-end lifecycle of Encounters and related claims.
- Provided a modular user interface to monitor Encounters, with the agility to adapt and change rules based on the changing business needs of trading partners.
- Improved efficiency and accuracy of the Encounter creation process with auto-correction capability.
- Where human intervention is required for correction, built a flexible error-correction management workflow that supports effective workload management while providing visibility into the corrections.
- Performed design and development, unit testing, integration, deployment packaging and checkout, and scheduling of components in Informatica Power Center and Solaris across several SDLCs, implementing ETL processes for an Oracle-based very large data warehouse and ODS that integrate subject-area data from disparate sources, with processes supporting downstream data marts for DSS/BI applications.
- Performed analysis of ETL applications and database systems, data structure analysis, data profiling for application design and defects resolution.
- Performed process enhancements such as capacity improvement, tuning/optimization and performance improvement of the Informatica Power Center and Oracle based ETL processes.
- Performed extensive metadata gathering and analysis of database and metadata repositories for understanding data lineage, data flow, process design, process statistics and identifying storage elements and access to data using JSON.
- Partnered with DBA/SA to optimize database tables/indexes, partitioning, parallelization and in implementing database/client configurations to support pipeline partitioning and multi-threaded processing.
- Participated in a Big Data initiative using a Cloudera-managed Hadoop ecosystem to store ODS data for analytics in Actian Vector on a YARN-based Hadoop cluster.
- Performed system analysis and created both functional and technical specifications documents. Designed and developed custom PL/SQL packages and procedures.
- Assisted junior developers with design and development of PL/SQL packages, Forms and Reports. Modified forms and reports for performance improvements.
- Streamlined SDLC processes by automating data sampling and data comparison for testing and analysis of source and target data for ETL application development, QA and UAT.
- Re-platforming: Involved in database migration from PADB to Vertica for the applications above and for Appraisal Data Analytics, requiring rapid conversion methodologies and integration solutions to convert legacy ETL processes into a Vertica-compatible architecture, using regular-expression-driven metadata conversion (such as ODBC reads and Power Exchange bulk writers) and replacements such as bulk merges and deletes with ODBC.
- Performance Tuning: Enhanced and troubleshot ETL process runtime contingencies by applying database tuning, query tuning, and application tuning solutions.
- Informatica Power Exchange product defect: Designed and developed an alternative method to reconcile rejected data caused by defective string operations in the Vertica plugin, analyzing the issue and building reusable Unix shell script components and configurable Informatica components instantiated in hundreds of ETL workflows.
- Vertica special character handling by ODBC, JDBC, Informatica, and SAS: Analyzed character handling, then designed and developed solutions to handle non-printable and high-octal symbols such as Spanish accented characters, removing contingencies of code-page-conscious conversions across the transport layers between application and database.
- Design and Architecture, ETL and Data application systems and frameworks: Performed requirements analysis, systems analysis, data analysis and designed application systems and framework architecture incorporating/implementing functional and non-functional systems specifications governed by enterprise business rules, security policies, environmental and application frameworks, delivered in both agile adopted and non-agile SDLC programs.
- Handled service requests, access requests, deployment requests, role/user requests, operations requests, etc., and related workflow management with ServiceNow.
- Hadoop and Big Data: Data ingestion into Hive (ACID, transactional, bucketed, managed, external tables) using Sqoop.
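The Hive ingestion pattern above (ACID, transactional, bucketed, managed tables) can be sketched in HiveQL; the schema below is illustrative, not taken from the project:

```sql
-- Illustrative Hive DDL for an ACID, bucketed, managed table
-- (database, table, and column names are hypothetical).
CREATE TABLE ods.claims (
  claim_id   BIGINT,
  member_id  BIGINT,
  claim_amt  DECIMAL(12,2),
  load_ts    TIMESTAMP
)
CLUSTERED BY (claim_id) INTO 8 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');
```

In a setup like this, Sqoop typically lands source rows in a plain staging/external table first, and an `INSERT INTO ... SELECT` moves them into the transactional table, since ACID tables require ORC storage and bucketing that Sqoop does not write directly.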
Environment: Informatica Power Center 9.6/9.5/9.1, Oracle 12c/11g, Toad/SQL Developer, SQL scripting/tuning, PL/SQL, XML, Solaris 10, Unix Shell Scripting
Confidential
Oracle SQL Lead/ Informatica Lead Developer
Responsibilities:
- Conceptualized Audit and Control Framework for ETL application system for Enterprise Data Warehouse initiative at SSA on Informatica Power Center platform. Implemented the framework by designing and developing ETL components in Informatica, Oracle and shell scripts.
- Led ETL development as SME in Informatica Power Center and Power Exchange application development, providing expert solutions for ETL application and data warehouse development for the EDW program, which also involved analysis of Informatica metadata, data profiling of the warehouse, data distribution in the MPP database, etc.
- Performed setting up and configuration of application services for EDW and DCPS application programs. Created maintenance tasks by developing domain backup, repository backup, file purging, activity logging, etc. scripts. Configured security by folder configuration, LDAP configuration, security domains, OS profiles, etc. Implemented SSL for Informatica Power Exchange for external bulk load utilities.
- Worked with Informatica to troubleshoot the rebranded ODBC drivers, identifying parallel-performance and driver-error issues.
- Implementing Data Security Policies in the Informatica Dynamic Data Masking server.
- Configuring DDM environments planned for enterprise wide databases classified by data applications.
- Developing and configuring DDM services, connection rules, and security rules for clustered DDM server nodes for Partial, Full and No Masking policies and user access classification.
- Developing Informatica DDM best practices for data security policy management, maintenance and deployment that requires migration, replication and synchronization of security rules, domains and services.
- Troubleshot QA defects in rule matchers and rule actions for data masking and SQL rewrites of requests containing simple and nested queries, including masking of multi-result-set stored procedure requests.
- Improved performance by choosing the optimal combination of Informatica, UNIX shell scripts, and database SQL or procedure objects for each task.
- Improved performance of ETL processes by employing performance bottleneck resolutions to Informatica Mappings, configuring Transformations.
- Implemented business rules for ETL and ELT using combination of transformations in Informatica mappings, UNIX shell scripting and Oracle SQL PL/SQL Stored Procedure components for data transformation, data load / update strategies for data warehouse to assist high performance data mining activities.
- Used Power Exchange Change Data Capture and bulk extraction for Oracle and mainframe, Informatica Data Quality (IDQ).
- Participated in ETL system audits with Informatica to ensure feasible standards and practices.
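An audit-and-control framework like the one conceptualized above typically records per-run load statistics in a control table; a minimal PL/SQL sketch, with hypothetical table and column names:

```sql
-- Hypothetical audit table and logging procedure for an ETL control framework
-- (identity columns require Oracle 12c, which is in the environment list).
CREATE TABLE etl_audit_log (
  audit_id     NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  workflow_nm  VARCHAR2(100),
  session_nm   VARCHAR2(100),
  rows_read    NUMBER,
  rows_loaded  NUMBER,
  status       VARCHAR2(20),
  run_ts       TIMESTAMP DEFAULT SYSTIMESTAMP
);

CREATE OR REPLACE PROCEDURE log_etl_run (
  p_workflow  IN VARCHAR2,
  p_session   IN VARCHAR2,
  p_read      IN NUMBER,
  p_loaded    IN NUMBER,
  p_status    IN VARCHAR2
) AS
BEGIN
  -- One row per session run; read vs. loaded counts expose rejects/drops.
  INSERT INTO etl_audit_log (workflow_nm, session_nm, rows_read, rows_loaded, status)
  VALUES (p_workflow, p_session, p_read, p_loaded, p_status);
  COMMIT;
END log_etl_run;
/
```

A procedure like this is usually invoked from a post-session task or shell wrapper, so every Informatica workflow leaves a queryable audit trail.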
Environment: Informatica Power Center 9.6/9.5/9.1, Oracle 12c/11g, Toad/SQL Developer, SQL scripting/tuning, PL/SQL, XML, Solaris 10, Unix Shell Scripting
Confidential
Oracle SQL Lead/Informatica Lead Developer
Responsibilities:
- Extracted data from various heterogeneous sources like Oracle, SQL Server, and flat files.
- Extensively used Informatica Client tools - Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Developed complex mapping using Informatica Power Center tool.
- Extracted data from Oracle, flat files, and Excel files, and applied Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router, and Update Strategy transformations to load data into the target systems.
- Created sessions, tasks, workflows, and worklets using Workflow Manager.
- Worked with the Data Modeler in developing Star Schemas.
- Involved in performance tuning and query optimization.
- Used TOAD, SQL Developer to develop and debug procedures and packages.
- Involved in developing the Deployment groups for deploying the code between various environments (Dev, QA)
- Created pre-SQL and post-SQL scripts to be run at the Informatica session level.
- Worked extensively with session parameters, mapping parameters, mapping variables, and parameter files for incremental loading.
- Used Debugger to fix the defects/ errors and data issues.
- Expertise in using both connected and unconnected Lookup Transformations.
- Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
- Developed Slowly Changing Dimension mappings for Type 1 and Type 2 SCDs.
- Monitored and improved query performance by creating views, indexes, hints, and subqueries.
- Extensively involved in enhancing and managing Unix Shell Scripts.
- Developed workflow dependencies in Informatica using Event Wait and Command tasks.
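The incremental loading with mapping variables described above usually comes down to a Source Qualifier SQL override filtered by the last successful run time; an illustrative sketch, where $$LAST_RUN_DATE is a hypothetical mapping variable persisted in the repository (or supplied via a parameter file):

```sql
-- Source Qualifier SQL override for incremental extraction (table and
-- column names are illustrative; $$LAST_RUN_DATE is expanded by Informatica
-- before the query reaches Oracle).
SELECT o.order_id,
       o.customer_id,
       o.order_amt,
       o.updated_dt
  FROM orders o
 WHERE o.updated_dt > TO_DATE('$$LAST_RUN_DATE', 'YYYY-MM-DD HH24:MI:SS')
```

After a successful session, a SETMAXVARIABLE-style assignment advances $$LAST_RUN_DATE to the newest `updated_dt` processed, so the next run picks up only new or changed rows.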
Environment: Informatica Power Center 9.1.0, Oracle 11g, SQL Server 2008, MS Access 2010, SQL*Loader, UNIX, Putty, SQL
Confidential
Oracle SQL/Informatica ETL Developer
Responsibilities:
- Involved in dimensional modeling of the data warehouse; designed the business processes, dimensions, and measured facts.
- Extracted data from flat files and other RDBMS sources into the staging area and populated the data warehouse.
- Developed a number of complex Informatica mappings, mapplets, and reusable transformations to implement business logic and load data incrementally.
- Worked on performance tuning of SQL and mappings by usage of SQL Overrides in Lookups, Source filter in Source Qualifier and data flow management into multiple targets using Router transformations.
- Implemented Slowly Changing Dimensions Type 1 and Type 2 to manage change data capture, per business user requirements.
- Worked with persistent caches for conformed dimensions for better performance and faster data loads to the data warehouse.
- Used Power Center Workflow manager for session management, database connection management and scheduling of jobs to be run.
- Developed Static and Dynamic Parameter Files for reusability and database connection management among Development/Testing/Production environments.
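The SQL overrides in Lookups mentioned above typically trim the lookup cache to only the rows and columns the mapping needs; an illustrative override against a hypothetical product dimension:

```sql
-- Lookup SQL override: cache only current rows and the two needed columns,
-- shrinking the (persistent) cache and speeding cache build
-- (table and column names are illustrative).
SELECT d.product_sk,
       d.product_code
  FROM product_dim d
 WHERE d.current_flag = 'Y'
 ORDER BY d.product_code
```

Informatica expects the override's ORDER BY to match the lookup ports, so the trailing ORDER BY on the condition column is kept rather than letting the tool append its own.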
Environment: Informatica Power Center 9.1.0, Oracle 10g, SQL Server 2005, SQL*Loader, SQL, TOAD, UNIX and MS Office.