- Accomplished analytics and insights professional with a record of increasing responsibility, demonstrating strong analytical, reasoning and problem-solving skills.
- Successfully implemented multiple end-to-end, scalable, high-throughput data integration platforms, gaining deep expertise in all phases of the project lifecycle: scoping, requirements, design, development, testing and deployment.
- Experienced in both Waterfall and Agile software development life cycles (Scrum and Kanban).
- Applied a range of data modelling techniques, selecting the appropriate technique at each level based on the data structure and requirements.
- Designed and implemented complex ETL scenarios using Informatica PowerCenter transformations: Source Qualifier, Joiner, Update Strategy, Connected and Unconnected Lookup, Expression, Router, Filter, Aggregator, Rank and Sequence Generator.
- Performed Informatica performance tuning by identifying and eliminating source, target, mapping and session-level bottlenecks.
- Created deployment groups, users and user groups, and managed user access profiles on development and test Informatica environments using Informatica Repository Manager.
- Transformed and loaded data using Teradata BTEQ and Teradata OleLoad.
- Interpreted Teradata EXPLAIN plans and tuned Teradata SQL for optimal performance by defining statistics, analysing indexes and rewriting queries.
- Deep domain expertise in telecom and finance, including core banking, consumer banking and wealth management.
- Basic understanding of big data, data science, machine learning and Python.
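The EXPLAIN-driven tuning workflow described above can be sketched in Teradata SQL as follows (table, column and index names are illustrative, not from a real engagement):

```sql
-- Inspect the optimiser's plan for a candidate query.
EXPLAIN
SELECT c.customer_id, SUM(t.txn_amount)
FROM   customer c
JOIN   transactions t
  ON   c.customer_id = t.customer_id
GROUP BY c.customer_id;

-- Define statistics on the join columns and indexes highlighted by the plan,
-- so the optimiser works from accurate data demographics.
COLLECT STATISTICS ON transactions COLUMN (customer_id);
COLLECT STATISTICS ON transactions INDEX (txn_date_idx);
```

With fresh statistics in place, re-running EXPLAIN confirms whether the optimiser has switched to a cheaper join or access path before the query is rewritten further.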
Data Architect / Senior Data Modeller
- In my current role, working alongside the enterprise architect, I am accountable for the following aspects of information management on the Snowflake platform:
- Setting up roles and access management in Snowflake.
- Creating POC templates for migrating data from SQL Server (in MS Azure) to Snowflake (on AWS), using Snowflake internal stages.
- Providing high-level architectural guidance to developers and data modellers on the new Snowflake platform.
- Proposed and implemented solutions to enforce referential integrity and eliminate duplicate data, since Snowflake does not enforce these constraints natively.
- Designed data models and reviewed ETL pipelines for consumer banking and core banking source systems; working knowledge of SnowSQL.
- Single-handedly converted SQL Server data warehouse (4.5 TB) objects to Snowflake objects and created POC templates for migrating the data to Snowflake.
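The internal-stage migration pattern mentioned above can be sketched in SnowSQL as follows (file and table names are hypothetical):

```sql
-- Upload a file exported from SQL Server to the target table's internal stage.
PUT file:///tmp/customer_extract.csv @%customer;

-- Copy the staged file into the target Snowflake table.
COPY INTO customer
FROM @%customer
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"');
```

Using the table stage (`@%customer`) avoids creating and securing a separate named stage for each one-off POC load.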
- As a core design committee member, responsible for formulating best practices, standards, processes and frameworks, and recommending the right technology stack so that fully governed, scalable data assets are built for BT.
- Using Informatica PowerCenter, created reusable sessions, workflows, mappings and mapplets, and built an ETL framework that streamlines batch processing, eases maintenance and reduces overall ETL complexity.
- Developed the ETL metadata model that drives the ETL framework and the data dictionary.
- As part of the ETL framework, using Informatica PowerCenter and Teradata, developed reusable Informatica workflows that perform data quality checks on incoming data, including row counts and checksums, to ensure the data is complete and correct.
- Developed numerous reusable referential integrity checks across multiple data entities.
- Imported data from multiple systems of record (SORs) using Teradata OleLoad.
- Collaborated with the business to understand requirements and develop the following artefacts:
- Interface specification document (System of Record (SOR) to data warehouse)
- Logical and physical data models along with source to target documents.
- Detailed technical design document.
- Proposed the test strategy and contributed to test planning.
- Reviewed and accepted test completion reports covering system testing and user acceptance testing.
- Reviewed handover documents and production runbooks to ensure a smooth handover of the solution to the support team.
- Reviewed the SOX compliance checklist to ensure the solution is SOX compliant.
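The row-count and checksum data quality checks described above can be sketched in Teradata SQL as follows (table and column names are illustrative):

```sql
-- Reconcile row count and an amount checksum between a staging table
-- and its warehouse target after a load.
SELECT 'stage' AS layer,
       COUNT(*) AS row_cnt,
       SUM(CAST(txn_amount AS DECIMAL(18,2))) AS amt_checksum
FROM   stg_transactions
UNION ALL
SELECT 'target',
       COUNT(*),
       SUM(CAST(txn_amount AS DECIMAL(18,2)))
FROM   tgt_transactions;
```

The batch is accepted only when both rows agree on `row_cnt` and `amt_checksum`; in the framework this comparison is wrapped in a reusable workflow that fails the session on a mismatch.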
- As part of the Talend proof-of-concept project:
- Created Talend workflows that import and transform data from various sources, using a range of Talend components.
- Debugged and troubleshot Talend workflows.
- Extracted data from the group data warehouse (GDW), calculated RBS (Retail Banking System) measures and loaded them into the new GDW using the Teradata Control Framework (TCF).
- Captured the correct metadata entries in the TCF metadata tables.
- Interpreted Teradata EXPLAIN plans and produced optimised queries that consume less of the server's TPerf capacity (TPerf is Teradata's traditional measure of server performance).
- Tuned Teradata objects by reviewing the EXPLAIN plan, selecting the right indexes, rewriting queries and defining statistics.
- Produced AutoSys JIL files for scheduling the ETL jobs.
- Attended and contributed to Agile ceremonies (backlog grooming, sprint planning, sprint demo/review and sprint retrospective) to align on common goals and share team progress.