
Senior Teradata Developer Resume


Irving, TX

SUMMARY

  • Tech-savvy IT professional with 15+ years of software experience in both engineering and lead roles, covering architecture and solutioning, requirements gathering, data analysis, data modeling, development, data transformation, data warehousing, and Master Data Management across various domains.
  • Expertise in delivering complex technical data warehousing initiatives from inception to delivery, creating realistic schedules, proactively setting business expectations, and managing risk.
  • Expertise in ETL methodologies and data transformations, building ETL solutions with Teradata, Oracle, Postgres, SQL Server, Informatica, Master Data Management, and Informatica Data Quality for new data warehouse builds, migrations, and change requests in the finance, insurance, and airline domains.
  • Involved in various projects related to data modeling, system/data analysis, design, and development for both OLTP and data warehousing environments.
  • Expertise in understanding business requirements, design, and peer reviews.
  • Over one year of experience in Informatica IDQ and MDM solutions.
  • Experience in capturing data from existing relational databases (Oracle, MySQL, Teradata) that provide SQL interfaces, using Sqoop.
  • Specific experience in the healthcare, insurance, finance, and airline domains.
  • Experienced in creating conceptual, logical, and physical data models.
  • Implemented slowly changing dimensions (Type 1 and Type 2) in dimension tables as per requirements.
  • Strong knowledge of the data warehousing life cycle, estimation, and planning.
  • Experience in data analysis, data modeling, data validation, and data-mismatch analysis.
  • Knowledgeable in creating business views to supply data for predictive/descriptive analytics.
  • Created mapping documents by analyzing source files; made enhancements to existing models.
  • Responsible for creating impact analyses when making changes to existing physical tables.
  • Good knowledge of normalization/denormalization and data extraction, cleansing, and manipulation techniques.
  • Experience in generating DDL scripts and creating indexing strategies.
  • Good understanding of dimensional and relational concepts and data modeling constructs such as star and snowflake schemas, fact and dimension tables, etc.
  • Created views so the business can pull warehouse data into their analytical models to drive business decisions.
  • Analyzed existing data models to ensure all data relationships are maintained during analysis. Knowledgeable in creating presentation layers to fulfill business users' requirements.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Worked with the business to create Cognos reports.
  • Proficient in analyzing data originating from various sources and loading targets using standard data warehousing patterns: stage tables to the ODS via transformations, and then into the EDW.
  • Experienced in Developing and testing End to End ETL lifecycle of data warehouse applications.
  • Experienced with capturing and documenting requirements from business users, managing project expectations and scope, planning, leadership and organizational skills.
  • Teradata V2R6 SQL Certified
  • Strong ability to lead and motivate project teams to ensure timelines and quality of deliverables.
  • Proven ability to think both analytically and strategically.
  • Interacted with product owners for feedback and implemented comments without delay.
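The Type 2 slowly changing dimension work mentioned above can be illustrated with a minimal Teradata-style sketch. All table and column names here (stg_customer, dim_customer, current_flag, etc.) are hypothetical placeholders, not taken from any actual project:

```sql
-- Close out the current dimension row when a tracked attribute changed.
UPDATE dim_customer
SET end_dt = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE current_flag = 'Y'
  AND customer_id IN (
      SELECT s.customer_id
      FROM stg_customer s
      JOIN dim_customer d
        ON s.customer_id = d.customer_id
       AND d.current_flag = 'Y'
      WHERE s.address <> d.address);

-- Insert the new version (and brand-new customers) with an open-ended range.
INSERT INTO dim_customer
    (customer_id, address, start_dt, end_dt, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_customer s
LEFT JOIN dim_customer d
  ON s.customer_id = d.customer_id
 AND d.current_flag = 'Y'
WHERE d.customer_id IS NULL;
```

Because the UPDATE runs first, changed customers no longer have a 'Y' row, so the second statement inserts their new version alongside genuinely new customers.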

TECHNICAL SKILLS

Databases: Teradata, Postgres, Oracle, SQL Server

ETL: Informatica

Master Data Management: Informatica MDM 10.0/10.1, Informatica Data Quality

Big Data: Sqoop

Skills: Data Analysis, Data Modelling

Learning Skills: Data Analytics using Python, R

Reporting Tools: Cognos

Other tools: Control-M, Erwin, Crystal Reports, DataFlux

PROFESSIONAL EXPERIENCE

Confidential - Irving TX

Senior Teradata Developer

Responsibilities:

  • Analyzed source files sent by clients for new client implementations, and took part in initial kick-off meetings to discuss and validate that all required data elements were available for implementation.
  • Worked with client team to understand their data dictionaries, identify data attributes, and model data in accordance with the product.
  • Generated ad hoc SQL while working with business teams and clients to identify active members and claim costs, ensuring alignment with client expectations before performing detailed analysis.
  • Performed data analysis on Eligibility and Claims files to identify referential integrity, data patterns and uniqueness between members and claims.
  • Identified PII elements and discussed with business and internal regulatory teams to make sure we comply with regulatory requirements.
  • Created metadata for column definitions and added column comments so the business can refer to and understand each column's purpose.
  • Worked with Product analyst and IT business owners on understanding the high-level features to translate them into data warehouse requirements.
  • Analyzed source files to understand various scenarios to model data, identify Indexes and create transformation rules.
  • Designed a standard stage layer where each client's source data is transformed into a common model, enabling code reuse in downstream models (with client-specific changes applied when pushing data to applications), which reduced development and implementation time.
  • Created table DDLs and mapping documents, along with transformation rules, for the team to load data into the warehouse.
  • Created complex Presentation layer views using Teradata SQL before loading into SQL Server for the application.
  • Involved in redesigning existing code to handle performance issues, reusability and code merge wherever possible.
  • Worked on Production data issues to perform root cause analysis and deliver solutions to the team with quick turnaround.
  • Worked in agile environment, participated in vertical slicing of the high-level feature into multiple user stories by delivering value through the iterations.
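A presentation-layer view of the kind described above might look like the following sketch. The schemas, tables, and measures (edw.member, edw.claim, paid_amt) are illustrative assumptions, not actual project objects; REPLACE VIEW is the usual Teradata idiom for creating or redefining a view:

```sql
-- Hypothetical presentation-layer view joining members to claim costs
REPLACE VIEW pres.v_member_claims AS
SELECT m.member_id,
       m.plan_cd,
       COUNT(DISTINCT c.claim_id) AS claim_cnt,
       SUM(c.paid_amt)            AS total_paid_amt
FROM edw.member m
LEFT JOIN edw.claim c
  ON c.member_id = m.member_id
GROUP BY m.member_id, m.plan_cd;
```

Views like this give the business a single point of reference while the loading into SQL Server for the application proceeds separately.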

Confidential - Fort Worth TX

Sr. Data Modeler

Responsibilities:

  • Worked with Product analyst and IT business owners on understanding the high-level features to translate them into data warehouse requirements.
  • Worked with product team to understand user interface and identified the data attributes to be sourced into Mosaic Datawarehouse.
  • Created tables on the big data platform and loaded data using ingestion techniques.
  • Loaded source data onto the big data platform as HDFS tables (browsable through Hue) so users could examine the data even before it was normalized and loaded into the data warehouse.
  • Led team modeling sessions with the business and ETL teams to understand the usage of data from the business perspective and its integration with existing subject areas.
  • Analyzed source files to understand various scenarios, model the data, and identify primary keys so that unique records are loaded into tables.
  • Analyzed sample files to come up with Conceptual model and to be in line with requirements to make sure business questions are answered.
  • Performed profiling based on identified data attributes to draw the relationship between tables, unique values mainly to understand the key extraction level based on PNR and Flight Options for each reshop etc.
  • Identified PII elements and discussed with business and internal regulatory teams to make sure we comply with regulatory requirements.
  • Used forward engineering to create a physical data model, with DDL, that best suits the requirements from the logical data model.
  • Developed Logical and physical models making sure that it can be integrated with existing subject areas to answer business analytics.
  • Setup time with business to go through the parsed source data and had learning sessions to understand how the data should be processed and used for further analytics.
  • Followed naming standards and applied the same to Physical columns.
  • Created metadata for column definitions and added column comments so the business can refer back to and understand each column's purpose.
  • Created Data Mapping documents for ETL team with transformations to load into tables.
  • Generated ad hoc SQL queries to identify unique values, required values, and transformation rules to fetch data from the source system into big data and then into the data warehouse.
  • Used Erwin for reverse engineering, connecting to the existing database and ODS to create graphical representations in the form of entity relationships and elicit more information.
  • Created table DDL’s, mapping document along with transformation rules for ETL team to load the data into warehouse.
  • Created complex presentation-layer views for the business using Teradata SQL, so the business can view the data from a single point of reference for their analytics.
  • Made data available to the business through views, so they can leverage the data in their analytical models.
  • Worked in agile environment, participated in vertical slicing of the high-level feature into multiple user stories by delivering value through the iterations.
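The forward-engineered table DDLs mentioned above would typically resemble the sketch below. The table, columns, and index choice are hypothetical; in Teradata the primary index drives row distribution, so it is usually chosen from the keys identified during profiling:

```sql
-- Hypothetical DDL produced by forward engineering from the physical model
CREATE MULTISET TABLE edw.flight_reshop
(
    pnr_id       VARCHAR(10)   NOT NULL,  -- booking record locator
    flight_nbr   INTEGER       NOT NULL,
    reshop_ts    TIMESTAMP(0)  NOT NULL,  -- when the reshop occurred
    option_rank  SMALLINT,                -- rank of the offered option
    fare_amt     DECIMAL(12,2)
)
PRIMARY INDEX (pnr_id, flight_nbr);
```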

Confidential - Fort Worth TX

Sr. Data Modeler/Analyst

Responsibilities:

  • Worked with Product analyst and IT business owners on understanding the high-level features to translate them into data warehouse requirements.
  • Worked with product team to understand user interface and identified the data attributes to be sourced into Mosaic Datawarehouse.
  • Interacted with business owners to identify attributes to be sourced into warehouse to fulfill need of business analytics.
  • Analyzed source files and presented data scenarios/irregularities to business and reported back to the product team for their analysis.
  • Interacted with the source team to create business column definitions, metadata and identify relationship between source tables.
  • Using source data, identified data patterns for sample origin-and-destination pairs to compare the reroute options the system provides (using AI) against what the customer actually chose.
  • Performed data profiling based on identified data attributes to draw the relationships between tables and unique values, mainly to understand the key extraction level based on flight/PNR/passenger, etc.
  • Identified PII elements and discussed with internal regulatory teams to make sure we comply with regulatory requirements.
  • Performed data analysis to integrate data source data with data available in warehouse, to make sure business questions are answered.
  • Performed data modeling by creating tables and identifying keys, nullability, etc. using modeling techniques.
  • Followed defined naming standards, security in terms of table/view access and retention architecture.
  • Created business rules for the ETL team to ensure that data loaded into the warehouse maintains its relationships (RI checks).
  • Created table DDL’s, mapping document along with transformation rules for ETL team to load the data into warehouse.
  • Created complex presentation-layer views for the business using SQL, so the business can view the data from a single point of reference for their analytics.
  • Made data available to the business through views, so they can leverage the data in their analytical models.
  • Worked in agile environment, participated in vertical slicing of the high-level feature into multiple user stories by delivering value through the iterations.
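The profiling step described above often comes down to checking whether a candidate key is actually unique before using it as the extraction level. A minimal sketch, with a hypothetical staging table and columns:

```sql
-- Does flight/PNR/passenger uniquely identify a row?
-- Any rows returned mean the candidate key has duplicates.
SELECT flight_nbr,
       pnr_id,
       passenger_id,
       COUNT(*) AS row_cnt
FROM stg.reroute_option
GROUP BY flight_nbr, pnr_id, passenger_id
HAVING COUNT(*) > 1;
```

An empty result supports choosing that column combination as the key; otherwise the duplicates point at an additional grain attribute that must be added.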

Confidential - Carrollton TX

Data Analyst

Responsibilities:

  • Studied in-house MDM requirements and converted them into business requirement documents.
  • Worked with business analysts for gathering requirements and Performed Analysis on Business Data.
  • Interacted with the business to decide which sources have the highest priority for identifying the golden record.
  • Reverse Engineered the staging data model from the existing data warehouse.
  • Analyzed the project requirements and modelled master data management tables.
  • Created dimensional model for the reporting system by identifying required dimensions and facts.
  • Developed logical and physical data models with the goal of balancing optimization of data access.
  • Responsible for defining the naming standards for the data warehouse.
  • Used Erwin for reverse engineering, connecting to the existing database and ODS to create graphical representations in the form of entity relationships and elicit more information.
  • Analyzed metadata repository options and made key recommendations towards the selection of MDM tool.
  • Created mappings from Source tables to the land table of MDM DB.
  • Created cleanse rules and match-and-merge rules.
  • Validated the mappings and data in the land and stage tables of the MDM DB, which apply various cleanse functions, using SQL queries.
  • Created jobs in the Informatica ETL tool to load data from all POS systems into the land table of MDM.
  • Coordinated with functional users for User Acceptance Testing prior to releasing system changes in production.
  • Prepared data for both positive and negative scenarios to showcase merge rules.
  • Worked in an Agile methodology and used TFS.
  • Prepared status summary reports with details of project status.
  • Participated in regular project status meetings.
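Validating the land-to-stage cleanse results described above can be done with a query that re-applies the expected cleanse logic and flags mismatches. Both table names (c_party_land, c_party_stg) and the phone-stripping rule are hypothetical illustrations, not actual MDM hub objects:

```sql
-- Flag stage rows whose cleansed phone does not match what the
-- digits-only cleanse function should have produced from the land value
SELECT l.src_rec_id,
       l.phone AS land_phone,
       s.phone AS stage_phone
FROM mdm.c_party_land l
JOIN mdm.c_party_stg  s
  ON s.src_rec_id = l.src_rec_id
WHERE s.phone <> REGEXP_REPLACE(l.phone, '[^0-9]', '');
```

An empty result confirms the cleanse function behaved as specified for that attribute.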

Environment: Informatica MDM, Informatica IDQ, Informatica 9.6, Erwin, Oracle 11G, Netezza, Unix, SQL Server, Microsoft Test Manager

Confidential - Dover NH

Informatica/ Teradata Developer

Responsibilities:

  • Created ETL mappings to load data from Oracle sources into the data warehouse Stage and ODS layers and then into the EDW.
  • Understood the business/transformation rules involved in developing source-to-target mappings, wrote complex SQL queries to reflect the rules, and created mappings to load data into target tables.
  • Performed data analysis to handle the data scenarios in the coding even before progressing into higher environments.
  • Created reports in Cognos, including dashboards, summary reports, master-detail reports, and drill-down reports.
  • Created test data for scenarios not available in the test files.
  • Wrote complex SQL in Teradata for data transformations.
  • Created mappings to identify Type 2 changes and load only the records that have actual changes.
  • Validated source and target databases during data extraction, transformation, and loading activities.
  • Assisted in data validation, covering data completeness, incompatible and duplicate data, and various other testing challenges between source and target, using complex SQL queries.
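Source-to-target completeness checks like those above are commonly written as set-difference queries. A minimal sketch with hypothetical source and target tables (Teradata and Oracle both support MINUS):

```sql
-- Rows present in the source extract but missing or different in the target;
-- run the reversed MINUS as well to catch unexpected extra target rows.
SELECT order_id, order_amt
FROM stg.orders_src
MINUS
SELECT order_id, order_amt
FROM edw.orders_tgt;
```

Two empty results (this query and its reverse) demonstrate that extraction, transformation, and loading preserved the compared columns exactly.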

Environment: Informatica Platform 9.5, Teradata, Cognos, UNIX, DataFlux, Oracle 11G, SQL Server

Confidential

Programmer Analyst

Responsibilities:

  • Responsible for understanding the current reporting landscape and process.
  • Responsible for identifying issues/pain points in the current process for end users.
  • Responsible for understanding the business need by analyzing key management reports, and for creating mappings for the reporting elements.
  • Responsible for understanding business requirements for ad hoc reports, and for assisting with the design and structure of the REG data/reporting architecture.
  • Responsible for providing areas of recommendation, along with the benefits and any challenges of the options, in a phased approach.

Environment: Cognos, Oracle, Data Analysis, Eagle Pace, UNIX

Confidential

Senior software engineer/Developer

Responsibilities:

  • Gathered requirements with business users for newly raised CRs.
  • Identified gaps in the new requirements and performed the mapping activities.
  • Provided effort estimates for the new requirements.
  • Interacted with business users to understand the business requirements, and with concerned parties to provide solutions that satisfied those requirements.
  • Prepared downstream mappings for the data mart in accordance with the new requirements.
  • Responsible for documenting the SDLC deliverables per standards.
  • Developed ELT (Extract, Load & Transform) scripts using Teradata & Sunopsis for Loading Data into Enterprise Data Warehouse (EDW).
  • Performed System Integration Testing (SIT) and supported User Acceptance Testing (UAT).
  • Supported production deployment and ongoing BAU (Business As Usual) to ensure that all files were pushed to Fermat for further processing.
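An ELT script of the kind described above typically pushes the transformation into Teradata itself as a set-based INSERT...SELECT rather than transforming row by row in the ETL tool. A minimal sketch with hypothetical staging and EDW tables:

```sql
-- Set-based ELT step inside Teradata: cast the staged text extract
-- into typed EDW columns in a single insert-select
INSERT INTO edw.acct_balance (acct_id, bal_dt, bal_amt)
SELECT acct_id,
       CAST(bal_dt_txt AS DATE FORMAT 'YYYY-MM-DD'),
       CAST(bal_amt_txt AS DECIMAL(15,2))
FROM stg.acct_balance_ext
WHERE bal_amt_txt IS NOT NULL;  -- skip rows with no balance to load
```

Keeping the transform in the database lets Teradata parallelize the work across AMPs instead of moving the data out and back in.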

Environment: Teradata, Informatica, Control-M, UNIX, Oracle
