Senior Data Analyst/Consultant Resume
SUMMARY:
- Senior Consultant with over 12 years of experience in Teradata BI/DW
- Experienced Big Data Hadoop developer
- Experienced in Hadoop tools such as MapReduce, Hive, Pig, and Sqoop
- Experienced in Teradata development in the banking domain
- Experienced in the design, implementation, and deployment of WhereScape RED
- Experienced in Base SAS programming, SAS macros, and procedures; well versed in ETL with SAS
- Experienced in the architecture, design, development, maintenance, and implementation of Teradata warehouses
- Expert in Data Analysis, Data Mining, Data processing, Data Management and working with data-warehouses
- Expert in Data Manipulation and efficient data extraction on databases
- Worked on designing and developing new processes and improving existing ones
- Experience in analyzing business requirements and translating requirements into functional and technical design specifications
- Extensively worked with Teradata client utilities such as MultiLoad, FastLoad, FastExport, and TPump, as well as Teradata administrator activities
- Knowledge of financial/banking LDMs
- Experienced in using Erwin
- Experienced in data warehousing concepts: E-R modeling (3NF), dimensional modeling (Star Schema, Snowflake Schema), database architecture for OLTP and OLAP applications, data analysis, and ETL processes
- Experienced in logical and physical data modeling for Teradata data warehouses
- Well versed with Teradata RDBMS installation, upgrade/migration, and configuration, Teradata performance tuning issues
- Experienced in understanding of Teradata Design Best Practice
- Expert in SQL programming, including stored procedures, triggers, and macros
- Experience with very large data warehouses (300 TB / 120 TB / 17 TB / 15 TB)
- Experience in dashboard data provisioning/dashboard design using Tableau
- Expert in data profiling, cleansing, monitoring business rules and data quality
- Expert in Teradata TD14.10/TD14/TD13.10/TD13/TD12/V2R6/V2R5
- Expertise in various Teradata tools for ETL/ELT, monitoring, and reporting
- Experience in Designing and Developing Semantic Processes
- Experienced in end-to-end project life cycles
- Expertise in data quality and data analytics
- Expert in understanding the performance impact of design decisions
- Experienced in handling databases like Oracle/DB2.
- Expert in ETL development with Informatica
- Expert in using OLAP/OLTP tools/systems/functions
- Experience in Unix shell programming
- Experienced in scripting and testing
TECHNICAL SKILLS:
Databases: Teradata, DB2, Oracle, MS Access.
Tools: Teradata tools, utilities, SQL, TPT, and Teradata ETL.
Data Visualization: Tableau.
ETL Tools: Informatica, WhereScapeRED
Microsoft Tools: MS-Word, MS-Excel
Big Data Tools: Hadoop, MapReduce, Hive, Sqoop, Pig.
Languages known: C, C++, SQL, Base SAS, VBA
Operating Systems: DOS, Windows, Unix, Linux, MP-RAS
PROFESSIONAL EXPERIENCE:
Confidential
Senior Data Analyst/Consultant
Responsibilities:
- Technical consultant for the design, development, and testing of the modules implemented.
- Work closely with the business on converting the Functional specifications into the Technical design documents.
- Redesigned the delta processing for various master data objects to eliminate out-of-sync issues between cross systems.
- Worked on source to target mapping documents for various functional requirements.
- POC for the planning system retrofit in the WhereScape RED environment.
- Implemented multiple frameworks in Teradata to support dashboards designed in NetWeaver.
- Designed and developed data quality dashboards using PHP/Java/HTML.
- Forecast analytics for demand planning.
- Worked on various integration deliverables dealing with cross systems.
- Involved in production support and performance tuning during pre- and post-production activities.
Environment: Teradata, SQL, Teradata Administrator, BTEQ, Informatica PowerCenter, Oracle, WhereScape RED, Microsoft Visio/Visual Source
Confidential
Senior Consultant (Developer)
Responsibilities:
- Technical consultant for execution of this project.
- Involved in transformation of business rules into technical specifications.
- Development of the code logic in Teradata and WhereScape RED.
- Performance tuning in the semantic layer
Environment: WhereScape RED, Teradata, SQL, Teradata Administrator.
Confidential
Senior Consultant (DBA/ Developer/Support)
Responsibilities:
- Technical consultant for execution of this project.
- Performed DBA activities on the UAT/ DEV and PROD environments.
- DBA activities covered the creation of objects and their access requirements.
- Development of the Staging Environment and History repository for CCA
- Automated jobs using Unix shell scripting.
Environment: Teradata, SQL, Teradata Administrator, BTEQ, Unix, Crontab, Shell Scripting, ETL.
Confidential
Senior Consultant
Responsibilities:
- Technical consultant for execution of this project.
- Designed the logical and physical data model for the phone shop module.
- Developed the staging environment, history repository maintenance, star schema data model and the semantic layer in WhereScape RED Environment.
- Prepared mapping documents and performed ETL loads using Informatica.
- Performed End to End testing of the phone shop module.
- Deployment of the phone shop module from UAT to DEV and from DEV to PROD Environment.
Environment: WhereScape RED, Informatica, Teradata, SQL, Teradata Administrator, BTEQ, Excel, ETL.
Confidential
Senior Consultant / SME
Responsibilities:
- Technical consultant for execution of this project.
- Worked on Teradata TPT scripts for the export import of the dictionary tables from one system to the other.
- Developed the BTEQ scripts for identifying the objects out of sync.
- Developed recursive functions to identify the dependencies of the objects
- Developed the automation of the jobs through Unix Shell Scripting.
Environment: Teradata, SQL, Teradata Administrator, BTEQ, Unix shell scripting.
Confidential
BI Developer
Responsibilities:
- Technical Lead for execution of this project.
- Installed the Cloudera/Apache Hadoop framework on a Linux system.
- Configured tools such as Hive, Pig, and Sqoop on the Hadoop server.
- Data movement from warehouse to Hadoop server using Sqoop.
- Data Analysis and Data mining on the Chat Text.
- Developed MapReduce programs to perform the analytics defined above.
- Developed Hive/Pig scripts to perform transformations on consumer chat text.
- Developed UDFs in Hive/Pig for several operations on chat text.
- Generation of the final Aggregated reports as per the Business requirement.
- Created dashboards using Tableau.
Environment: Hadoop, MapReduce, Hive, Sqoop, Pig, Teradata, SQL, ETL, Tableau.
Confidential
Responsibilities:
- Lead for execution of this project.
- Data movement from source system to Target system using Informatica ETL
- Prioritizing the existing rules defined in the system.
- Applying the new rules to the transactions.
- Generate the Rule Matrix and resolve the overlapping issues.
Environment: Teradata, ETL, SQL, Unix Shell Scripting
Confidential
Responsibilities:
- Designed the logical and physical model for a bank metadata system called THOR
- Designed ETL jobs for THOR using Teradata MultiLoad and BTEQ
- Designed archive and purge jobs and participated in perm space saving activities using Teradata multi-value compression techniques
- Involved in performance tuning of complex SQL reports
- Designed the LDM and physical data model for DAQQUERY, a system to evaluate the data quality of THOR
- DB System capacity planning
- Resolved issues and presented many issue resolutions to business users
- Designed and Created data cleansing, data conversion, validation and External loading scripts like MLOAD and FLOAD for Teradata warehouse using Informatica ETL tool.
- Executed SAS scripts to bring data from various data sources onto the SAS platform.
- Led a team of 6 vendor-side consultants and held daily meetings on issue resolution
- Creating dashboard data provisioning/dashboard design using Tableau
- Fine-tuned poorly performing ad hoc and complex SQL queries to minimize execution time and spool space usage
- Created DDLs and DMLs per the design requirements
- Checked that data distribution was even across all AMPs in the DB system
- Performed data profiling, cleansing, business rule monitoring, and data quality checks using the Trillium Software tool.
- Worked on scheduling Teradata ETL jobs with Windows batch files
- Designing ETL jobs as per business requirements and preparation of design documents
- Designed reporting queries per end-user requirements with the highest degree of performance.
- Automated scripts required for the bank's analytical needs
- Mentoring junior team members in all technical concepts
- Gathered requirements and reporting issues from end users
- Held business meetings with stakeholders.
Environment: Teradata, Informatica, ETL, SQL, MS Excel, Tableau, SAS
Confidential
Responsibilities:
- Lead for execution of this project.
- Data Analysis and Data Mining of various information of data products present in the Bank
- Involved in Information Architecture and Data Modeling.
- Data movement of identified information from source systems to MS Access.
- Used ETL tools such as Informatica to perform ETL operations from source systems to targets.
- Data migration from Oracle/SQL Server/DB2 to the Teradata system
- Consolidation of the data on the warehouse.
- Built a GUI application on MS Access using VBA to interact with the underlying tables.
Environment: Teradata, SQL, Oracle, DB2, MS Access, VBA, Tableau, Base SAS
Confidential
Responsibilities:
- Technical Lead for execution of this project.
- Design and implementation of the project from end to end
- Data mining and data analysis of the data marts in the OLTP system.
- Information Architecture Design, Data modeling and design for BI
- Identify the data marts present in Teradata/Oracle/DB2 OLTP Hubs.
- Data Migration from Oracle and DB2 systems to Teradata system.
- Building of the Staging Data mart tables in the Teradata data warehouse.
- Perform ETL operation for transferring the data from the source system to the target data warehouse environment using Informatica, SSIS and others.
- Build the data process algorithms in SAS Environment.
- Performed complex transformation processes on temporary data sets
- Identified data quality issues and reconciled them.
- Generated aggregated reports on the final data mart.
Environment: Teradata, Oracle, DB2, SAS, Teradata utilities, Informatica, SSIS, ETL, SQL, Erwin.
Confidential
Teradata Database developer and Engineer
Responsibilities:
- Senior Team member in the Sustaining Team.
- Handled issues coming up in the Resolver, Parser, and Optimizer modules.
- Sound knowledge of Teradata internal Architecture and design.
- Involved in Teradata RDBMS installation, upgrade / migration, and configuration, tuning and diagnosing performance issues.
- Expert in SQL development, including stored procedures, triggers, and macros.
- Good understanding of Teradata load/unload utilities such as TPump, FastLoad, MultiLoad, FastExport, and TPT
- Good knowledge of database monitoring and Teradata Tools & Utilities.
- Expert in performance tuning and rewriting of SQL.
- Involved in DBA Activities like Performance monitor, Scheduling, Teradata Manager activities.
- Handled archive and restore jobs from one version to another.
- Involved in Designing and Developing Semantic Processes
- Handled performance issues arising from design decisions.
- Worked on various diagnostic modes at system level and session level.
- Worked on Data Dictionary tables.
- Advanced expertise in Teradata Application Utilities and Database Administration Utilities viz., BTEQ, Teradata SQL Assistant, Teradata Administrator, Teradata Pmon, Teradata Manager, Teradata Visual Explain.
- Experienced in trouble-shooting techniques, tuning SQL statements, Query Optimization, Dynamic SQL and Materialized Views.
- Technical expert in the areas of relational database logical design, physical design, and performance tuning of the RDBMS.
- Excellent in performance tuning in Teradata.
- Sound Knowledge about Indexes (JOIN Indexes, Primary Indexes and Secondary Indexes) and other features of Teradata.
- Sound Knowledge of Data Warehousing concepts, E-R model & Dimensional modeling (3NF), like Star Schema, Snowflake Schema and database architecture for OLTP and OLAP applications, Data Analysis and ETL processes.
- Strong Technical Writing and user documentation Skills.
- Very good experience in Semantic Data modeling, which includes creation of complex views over the base tables for reporting purposes.
- Experience in creating and handling PPI, UPI, NUPI, NUSI, and USI.
- Involved in testing of the fix in the Teradata patch modules
- Created BTEQ scripts for testing the new modules.
- Involved in the creation of derived and volatile tables.
- Expert in writing UDF and Implementing them.
- Involved in writing complex queries, subqueries, and correlated subqueries to observe behavioral patterns in new releases.
- Involved in the creation of join indexes/materialized views for testing the UDD feature
- Involved in writing code for the reuse of memory segments as part of the memory manager implemented in TD12.
- Resolved hundreds of issues in the Resolver as part of the QRW feature in TD12.
- Handled issues related to views getting spooled and folded in TD12.
- Expert in handling the Explain Plan and Join plan issues.
- Involved in writing queries, collecting stats, and reviewing Explain plans.
- Well versed in the best practices for writing SQL
- Handled tasks in releases from V2R5 to TD14, with sound knowledge of major features such as Collect Stats, NoPI, scalar subqueries, and TPT
Environment: Languages: C/C++; Database: Teradata V2R5/V2R6/TD13/TD14.0; Tools: Teradata DBA tools, Teradata Visual Explain, BTEQ, TSET, Teradata SQL
Confidential
Software Engineer
Responsibilities:
- Creation and execution of Teradata Load Job Scripts on BDW Environment.
- Handle the issues coming up in the Teradata load jobs.
- Worked on Teradata V2R5 Environment.
Environment: Teradata, BTEQ, Teradata Tools and Utilities.