- 13 years of experience in data warehousing development using Informatica PowerCenter, Informatica IDQ, Oracle, Teradata, and big data technologies such as Hadoop and Hive.
- Currently working as a Tech Lead.
- Hands-on coding experience in Informatica, IBM DataStage, Oracle PL/SQL, Teradata, batch scripting, and UNIX shell scripting.
- Strong skills in data analysis, data requirement analysis, data gap analysis, data migration, data governance, root cause analysis, and data mapping for ETL processes.
- In-depth knowledge of the Teradata cost-based query optimizer, identifying potential query bottlenecks arising from query writing, skewed redistributions, join order, optimizer statistics, and physical design choices (PI/USI/NUSI/JI); see the tuning sketch after this list.
- Strong hands-on experience with Teradata utilities and tools: SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT, and Viewpoint.
- Experienced in designing and architecting data warehouse projects, including executing a complete architecture redesign and stabilizing it.
- Experience working with Azure Blob Storage, Data Lake, ADF, and Databricks.
- Good understanding of Spark architecture.
- Expertise in handling complex requirements and projects, and in designing, building, and developing the ETL for them.
- Experience interacting directly with clients to understand requirements precisely and deliver solutions with a first-time-right approach.
- Skilled at identifying the root cause of production issues and quickly providing fixes.
- Practical working knowledge of all SDLC phases (requirement analysis, design, development, testing, implementation, and maintenance), including agile projects run under Scaled Agile.
- Exposure to Teradata, Hive, and Oracle.
- Excellent communication, interpersonal, and analytical skills; able to work to strict deadlines and in diverse conditions.
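To make the optimizer-tuning bullet above concrete, here is a minimal sketch of the kind of diagnostics involved. All object names (edw.claim, edw.member, member_id, claim_amt) are hypothetical placeholders, not objects from any actual engagement.

```sql
-- Check row distribution across AMPs for a join/PI column; a handful of
-- AMPs holding most of the rows signals skewed redistribution in joins.
SELECT HASHAMP(HASHBUCKET(HASHROW(member_id))) AS amp_no,
       COUNT(*)                                AS row_cnt
FROM   edw.claim
GROUP  BY 1
ORDER  BY 2 DESC;

-- Refresh optimizer statistics so row estimates and join order are sane.
COLLECT STATISTICS ON edw.claim COLUMN (member_id);

-- Read the plan: look for product joins, duplication of a table to all
-- AMPs, and low-confidence row estimates.
EXPLAIN
SELECT c.member_id, c.claim_amt
FROM   edw.claim  c
JOIN   edw.member m ON m.member_id = c.member_id;
```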
Databases: Oracle 10g, Teradata, Hive
ETL: Data warehousing concepts, Informatica PowerCenter, Informatica Developer (BDE), IBM DataStage
Tools: Autosys, PVCS, Dollar Universe, Kintana, TES, Artifactory
Scripting: Shell scripting
Domain Knowledge: Healthcare, Manufacturing, Asset Management
- Work with product owners, business analysts, and client areas to prepare functional specifications during the project requirements stage.
- Design, development, and unit testing of mappings using Informatica PowerCenter and IDQ.
- Creation of stored procedures for complex designs, such as handling temporal and bitemporal tables in Teradata; see the stored-procedure sketch after this list.
- Design and development of integrated solutions for complex requirements, such as handling multiple source windows and member and group addresses from Facets.
- Implementation of Address Doctor for address standardization using Informatica IDQ.
- Automation of PBM loads by designing an adjustment process to replace the existing manual one, and validation of file loads through automated audits.
- Design and development of an archive-and-purge mechanism for incremental data tables, with a flexible retention period at each table level for easier troubleshooting; see the retention sketch after this list.
- Performance tuning of long-running queries through redesign, reducing batch run time by 40% to meet SLAs.
- Quick root cause analysis and fixes for production issues.
- Automation of the DB object deployment process using batch scripts.
- Consolidation of Control-M jobs through automation, saving the customer $160K in licensing costs.
- Oversee design reviews to proactively identify potential problems and suggest alternative designs.
- Created pipelines in ADF using linked services and datasets to load files into the data lake and copy them to Teradata.
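A minimal sketch of the temporal-table stored-procedure pattern referenced above, assuming a Teradata table defined with a VALIDTIME period column. The procedure, table, and column names (edw.sp_update_member_addr, edw.member_addr, addr_line, member_id) are illustrative, not the actual objects.

```sql
REPLACE PROCEDURE edw.sp_update_member_addr (
    IN p_member_id INTEGER,
    IN p_addr_line VARCHAR(200)
)
BEGIN
    -- On a VALIDTIME table, a CURRENT VALIDTIME update closes the
    -- currently valid row version and opens a new one automatically,
    -- preserving history for point-in-time queries.
    CURRENT VALIDTIME
    UPDATE edw.member_addr
    SET    addr_line = :p_addr_line
    WHERE  member_id = :p_member_id;
END;
```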
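And a sketch of the archive-and-purge idea with per-table retention driven by a config table; every name here (etl_ctl.retention_cfg, stg.claim_incr, arch.claim_incr, load_dt, retention_days) is a hypothetical stand-in.

```sql
-- 1. Move rows older than this table's retention window to the archive.
INSERT INTO arch.claim_incr
SELECT s.*
FROM   stg.claim_incr s
WHERE  s.load_dt < (SELECT CURRENT_DATE - r.retention_days
                    FROM   etl_ctl.retention_cfg r
                    WHERE  r.tbl_name = 'claim_incr');

-- 2. Purge the same rows from the active incremental table.
DELETE FROM stg.claim_incr
WHERE  load_dt < (SELECT CURRENT_DATE - r.retention_days
                  FROM   etl_ctl.retention_cfg r
                  WHERE  r.tbl_name = 'claim_incr');
```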
- Brainstorming the new design and architecture to be used for the jobs.
- Building mappings and workflows replicating the current logic of the PowerCenter ETLs.
- Proactively identifying limitations, building workarounds to overcome them, and mitigating the risks.
- Redesigning the architecture, which had bugs and needed stabilization.
- Proactively identifying and fixing risks in the architecture provided.
- Mentoring onsite and offshore teams, training them on the Informatica tool, and delivering KT sessions on the new architecture to multiple teams.
- Delivering on schedule despite tight timelines, including a redesign implementation that was not planned as part of the project.
- Developed mappings, workflows, and logic for metrics.
- Handled ADT and DLP end to end, coordinating directly with the business.
- Reduced load and development time for replica creation.
- Creation of mappings and workflows per requirements.
- Analysis of legacy data sources and conversion of them into the ETL structure.
- Creation of PL/SQL objects and building of reports.
- Fixing issues in mappings and workflows.
- Performance tuning of ETL.
- Creation of Autosys jobs.
- Gathering requirements from clients.
- Helped the senior data modeler design the data model for the requirement.
- Creating design documents for the data model.
- Helped teammates with ETL design for development in IBM DataStage.
Sr. Software Engineer
- Creation of audit jobs in the DataFlux tool.
- Writing Teradata queries for data retrieval and updates.
- Supporting the quality runs.
- Uproc and session creation and modification.
- Fixing issues in Informatica mappings and sessions.
- Supporting the FPR and fixing issues in it.
- Performing unit testing and integration testing in Dollar Universe.
- Scheduling Informatica and DataFlux jobs using Dollar Universe ($U).
- Root cause analysis for missing and mismatched records between source and target tables; see the reconciliation sketch after this list.
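A sketch of the reconciliation queries behind the root-cause bullet above; the schemas and columns (src.policy, tgt.policy, policy_id, policy_status) are placeholders. Each MINUS pass isolates rows present on one side only.

```sql
-- Rows in the source that are missing or different in the target.
SELECT policy_id, policy_status FROM src.policy
MINUS
SELECT policy_id, policy_status FROM tgt.policy;

-- Rows in the target with no matching source row (unexpected extras).
SELECT policy_id, policy_status FROM tgt.policy
MINUS
SELECT policy_id, policy_status FROM src.policy;
```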
Sr. Software Engineer
- Modified Informatica mappings and workflows per requirements.
- Creating CRs to resolve issues in existing ETLs or data.
- Resolving issues encountered while loading data into the database through utilities.
- Performance tuning of workflows with long execution times.
- Finding the root cause of mismatches in table amount columns and fixing them; see the aggregate comparison sketch after this list.
- Checking the logic in EDW and EDW2B and resolving any issues found.
- Developed mappings, sessions, and workflows for the new ETL process, following quality standards.
- Implemented all ETL architecture standards for the newly adopted Teradata platform.
- Involved in review meetings during design and development.
- Involved in data comparison between the EDW and EDW2B environments.
- Scheduling ETLs according to dependencies.
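For the amount-column mismatch bullet above, a minimal sketch of an aggregate-level comparison between the two environments before drilling into individual rows; all names (edw.txn, edw2b.txn, acct_id, txn_amt) are hypothetical.

```sql
-- Compare summed amounts per account across EDW and EDW2B; only
-- accounts whose totals differ, or that exist on one side, surface.
SELECT COALESCE(a.acct_id, b.acct_id) AS acct_id,
       a.tot_amt AS edw_amt,
       b.tot_amt AS edw2b_amt
FROM  (SELECT acct_id, SUM(txn_amt) AS tot_amt
       FROM edw.txn GROUP BY acct_id) a
FULL OUTER JOIN
      (SELECT acct_id, SUM(txn_amt) AS tot_amt
       FROM edw2b.txn GROUP BY acct_id) b
ON a.acct_id = b.acct_id
WHERE COALESCE(a.tot_amt, 0) <> COALESCE(b.tot_amt, 0);
```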