Data Reporting Lead Resume

SUMMARY

  • Over 15 years of experience architecting and designing ETL solutions for data warehouses using Informatica
  • Architected complete ETL workflows for the insurance and digital marketing domains
  • Skilled in ETL application and database performance tuning, including database query optimization
  • Experienced in identifying gaps in business processes and data redundancy, and recommending improvements
  • Implemented Informatica B2B parsers to parse complex source reports in .txt and .pdf formats
  • Good understanding of software design/architecture for BI/data warehousing (OLAP) and OLTP systems, with working knowledge of design patterns
  • Extensive work with ETL tools: Informatica PowerCenter, Pentaho, Pervasive
  • Worked on data extraction, transformation, loading, and data quality analysis using tools such as Informatica, Informatica B2B, Oracle, Teradata, Amazon Redshift, MS SQL Server, and Netezza
  • Hands on experience in handling data from various source systems such as Flat Files, XML Source, Oracle, MS SQL Server, IBM DB2, Teradata and Excel Files
  • Expert developer skills in Teradata RDBMS: initial Teradata environment setup, development and production DBA support, and use of the FASTLOAD, MULTILOAD, TPUMP, and BTEQ utilities with Teradata SQL
  • Completed a proof of concept (POC) on Big Data / Hadoop
  • Resolved semantic discrepancies in data definitions arising among multiple sources
  • Skilled in conducting and participating in stakeholder meetings and project meetings across DEV and QA teams
  • Well versed in Agile methodology and in guiding teams to adopt Agile working practices
  • Excellent scripting experience in Unix Shell
  • Experience managing a team of more than 14 colleagues
  • CSM (Certified Scrum Master)
  • Excellent interpersonal, analytical, and communication skills; self-motivated quick learner, team player, and leader
  • Gaining hands-on experience with Power BI, Snowflake, and AWS

TECHNICAL SKILLS

ETL Tools: Informatica 10.4.x, 10.1.x, 9.x, 8.x, 7.x, Pentaho, Pervasive, SSIS

Databases: Oracle, MS SQL Server, IBM DB2, Teradata, Netezza, Amazon Redshift, Sybase

Languages: SQL, UNIX shell scripting

Scheduling tools: Autosys, ESP, SAP-CPS

Operating Systems: UNIX, Windows

ERP/CRM Systems: Salesforce, MS Dynamics

PROFESSIONAL EXPERIENCE

Confidential

Data Reporting Lead

Responsibilities:

  • Resolved P1/P2/P3 incidents
  • Designed solutions and permanent fixes for recurring P2/P3 incidents
  • Provided RCA (root cause analysis) for P1 incidents
  • Created end-to-end solutions for new development in Informatica
  • Migrated Informatica 10.1.1 to Informatica 10.4.1
  • Architected the solution for converting stored procedures into ETL workflows
  • Coordinated with team members across the globe
  • Supported the legacy tools DT Studio and GGY Axis
  • Designed the solution for migrating DT Studio code to Informatica

Confidential

ETL Architect

Responsibilities:

  • Served as onshore coordinator for the offshore team
  • Extensively used Informatica PowerCenter for extracting, transforming, and loading data from relational and non-relational sources
  • Architected the complete Informatica ETL workflow
  • Designed the ETL solution for the CRM application Salesforce
  • Created mappings and workflows for Salesforce data quality reporting
  • Interacted with source system users to identify source system entities and designed the ETL solution to pull inbound data
  • Used lookups in load jobs to validate the referential integrity set up on base objects
  • Resolved semantic discrepancies in data definitions arising among multiple sources
  • Interacted with users to notify them of issues and ensure corrections were made consistently across systems, preserving data integrity
  • Engaged with source system users to gather requirements for new inbound feeds
  • Built requirement specifications for the ETL team to work from
  • Extensively used the Sequence Generator, Expression, Filter, Router, Sorter, Aggregator, Lookup (static and dynamic), Update Strategy, Source Qualifier, and Joiner transformations
  • Worked on data modeling
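The lookup-based referential-integrity check described above is configured inside Informatica, but the underlying idea can be sketched generically. The following Python sketch (hypothetical column and table names, assuming the reference data fits in memory) mimics a static Lookup transformation that splits inbound rows into valid rows and RI violations:

```python
# Hedged illustration: a static lookup validates incoming rows against
# reference data, analogous to an Informatica Lookup transformation
# checking referential integrity on a base object. Names are hypothetical.

def build_lookup(reference_rows, key):
    """Static lookup cache: key value -> reference row."""
    return {row[key]: row for row in reference_rows}

def validate_referential_integrity(source_rows, lookup, key):
    """Split source rows into valid rows and RI violations."""
    valid, rejects = [], []
    for row in source_rows:
        (valid if row[key] in lookup else rejects).append(row)
    return valid, rejects

customers = [{"cust_id": 1}, {"cust_id": 2}]           # reference (base object)
orders = [{"order_id": 10, "cust_id": 1},
          {"order_id": 11, "cust_id": 99}]             # inbound data
lookup = build_lookup(customers, "cust_id")
valid, rejects = validate_referential_integrity(orders, lookup, "cust_id")
print(len(valid), len(rejects))  # 1 valid row, 1 RI violation
```

In Informatica terms, the rejects path would typically feed a reject file or error table rather than a Python list.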

Confidential

ETL Lead

Responsibilities:

  • Architected the solution for the Staging and DWH phases
  • Provided the client with recommendations for areas of improvement
  • Responsible for designing and developing critical components as Informatica mapplets / reusable components
  • Coordinated with business users on change requests
  • Implemented an ETL solution for near-real-time data
  • Created the data mappings to load Landing Zone data to staging objects
  • Created batch jobs to load data pertaining to specific base objects (including the corresponding Stage and Load jobs)
  • Exported the developed code as XML for use in other environments for testing, i.e., QA and Prod
  • Extensively used data cleanse functions to standardize data in mappings and created complex user-defined cleanse functions for reuse across mappings
  • Used lookups in load jobs to validate the Referential Integrity set up on base object
  • Set up trust rules to treat the most reliable source as the best version of truth, and validation rules to choose the correct value when data is available from multiple sources
  • Used different profiling methods, e.g., midstream profiling and join analysis profiling
  • Exported the mapplets created in Developer tool and used them in PowerCenter mappings for data standardization and cleansing.
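The reusable cleanse functions mentioned above live in the Informatica Developer tool, not in code the resume shows; as a generic illustration only, here is a minimal Python sketch of the same pattern — one standardization function applied to a field across all rows (phone format and field names are assumptions, not taken from the resume):

```python
# Hedged sketch of a reusable data-cleanse function, analogous to the
# user-defined cleanse functions described above. The US phone format
# and the row/field names are illustrative assumptions.
import re

def standardize_phone(raw):
    """Normalize a US-style phone number to XXX-XXX-XXXX, or None if invalid."""
    digits = re.sub(r"\D", "", raw or "")
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                # drop country code
    if len(digits) != 10:
        return None                        # unrecoverable value
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"

def cleanse(rows, field, func):
    """Apply one cleanse function to one field across all rows."""
    return [{**row, field: func(row.get(field))} for row in rows]

rows = [{"id": 1, "phone": "(212) 555-0100"},
        {"id": 2, "phone": "1-303-555-0101"},
        {"id": 3, "phone": "bad"}]
cleansed = cleanse(rows, "phone", standardize_phone)
```

Packaging the rule as a standalone function mirrors the mapplet approach: one tested definition reused wherever the field appears, instead of per-mapping copies.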
