Amazon Web Services - Technical Consultant: Resume

Miami, FL

PROFESSIONAL SUMMARY:

  • 10 years of experience in data warehousing systems, including 7 years in Teradata and 3 years in OLTP/OLAP systems.
  • Worked as Teradata Solution Architect, Teradata Tech Lead, Data Modeler, and Senior Teradata/MPP Database Developer on various projects.
  • Working in the Amazon Web Services (AWS) cloud environment for data warehousing.
  • Working on a migration project from SQL Server 2012 to the AWS Redshift database.
  • Analyzing data against business requirements and preparing mapping sheets for development.
  • Experienced with MPP databases such as Teradata and Redshift, with working knowledge of Greenplum and Netezza.
  • Space Allocation - Assigning Permanent Space, Spool Space and Temporary Space.
  • Access of Database Objects - Granting and Revoking Access Rights on different database objects.
  • System Performance - Use of Performance Monitor (Teradata Viewpoint), Priority Scheduler and Job Scheduling.
  • Resource Monitoring - Database Query Log (DBQL) and Access Logging.
  • Responsible for maintaining all DBA functions (development, test, production) in operations 24×7.
  • Performance tuning, including collecting statistics, analyzing EXPLAIN plans, and determining which tables needed statistics; improved performance by 35-40% in some situations (see the statistics-collection sketch after this list).
  • Created MultiLoad, BTEQ, FastLoad, and TPump scripts. Also created and modified databases, performed capacity planning, allocated space, and granted rights on all objects within databases.
  • Worked on creating and managing partitions; performed database health checks and tuned databases using Teradata Viewpoint.
  • Delivered new, complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
  • Loaded data from various data sources and legacy systems into Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad, mainframes, and DataStage (a FastLoad script sketch follows this list).
  • With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks across query writing, skewed redistributions, join order, optimizer statistics, and physical design considerations (PI, USI, NUSI, JI, etc.). In-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
  • Acted as a liaison between the offshore and onsite teams.
  • Designing Teradata objects and identifying corresponding roles and grants.
  • Preparing implementation strategies for Teradata objects for production enhancements
  • Good at documenting the code and preparing review docs.
  • Tuned the most resource-consuming SQL by rewriting Teradata queries.
  • Profiled data for column-level compression and identified columns for daily/weekly statistics collection.
  • Managing the users and running the scripts for maintenance through Teradata Viewpoint.
  • Solved production issues with the Teradata system and coordinated with NCR support for resolutions.
  • Involved in node upgrade processes and the Teradata version upgrade from 12.0 to 13.10.
  • Suggested USI/NUSI/PPI/PI/NUPI choices and implemented them wherever required (see the DDL sketch after this list).
  • Prepared physical data models for some of the data marts (Customer, Flight, and Schedule Management) using the Erwin tool.
  • Experienced in interacting with users, analyzing client business processes, documenting business requirements, performing design analysis, and developing design specifications.
  • Very good expertise in Teradata optimization techniques.
  • Defining data warehouse architecture layers (Staging/ODS/SRD/Meta/Base) and table models.
  • Experienced in Automating and Scheduling the Teradata SQL Scripts.
  • Experienced in working with large data volumes using Teradata SQL and managing that data.
  • Handled 60 TB and 18 TB data warehouse systems.
  • Experienced in writing Design Documents, System Administration Guides, Test Plans & Test Scenarios/Test Cases and documentation of test results.
  • Experience in data warehousing architecture and technology, including data marts and data mining, using logical dimensional modeling (Star Schema/Snowflake Schema/Hybrid), facts, dimensions, SCDs, and Staging Area/ODS.
  • Deep insight into harnessing the inherent parallelism of Teradata across applications, ETL processes, load operations, and queries.
  • Skilled in ETL Processes, SQL Query Tuning, Database Performance Monitoring, Teradata SQL, Macros.
  • Extensive experience with the PDCR database and DBQL queries (a DBQL query sketch follows this list).
  • Analytical problem-solver, able to anticipate issues and create systems that streamline operations, resolve concerns raised by the business, and improve team efficiency.
  • Directed the planning, design, production and management of Data warehousing applications.
  • Conducted many classroom trainings on Teradata technologies at Tech Mahindra.
  • Assess project issues and identify solutions to meet productivity, quality and customer goals.
  • Excellent at preparing and reviewing SOWs and at HLA preparation for small- to large-scale enhancements.
  • Proposed solutions for migration (mainframe to DataStage) projects.
  • Published white papers on data modeling and implementing data warehouse in cloud environment.
  • Reviewed SOWs and prepared estimates for small and medium-sized enhancement projects.
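
As context for the statistics-collection and EXPLAIN work above, a minimal Teradata sketch; the database, table, and column names are hypothetical:

  -- Inspect the optimizer's plan to spot join steps with low-confidence estimates
  EXPLAIN
  SELECT o.order_id, c.customer_name
  FROM sales_db.orders o
  JOIN sales_db.customers c
    ON o.customer_id = c.customer_id;

  -- Collect statistics on the columns the plan flags
  COLLECT STATISTICS ON sales_db.orders COLUMN (customer_id);
  COLLECT STATISTICS ON sales_db.customers COLUMN (customer_id);

  -- Verify what has been collected
  HELP STATISTICS sales_db.orders;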
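
A minimal FastLoad script sketch along the lines of the load jobs above; the logon, file path, and table names are hypothetical:

  LOGON tdpid/etl_user,password;               -- hypothetical logon
  DROP TABLE stage_db.orders_err1;             -- FastLoad requires empty error tables
  DROP TABLE stage_db.orders_err2;
  SET RECORD VARTEXT "|";                      -- pipe-delimited input
  DEFINE order_id   (VARCHAR(10)),
         order_date (VARCHAR(10)),
         amount     (VARCHAR(12))
  FILE = /data/in/orders.txt;                  -- hypothetical path
  BEGIN LOADING stage_db.orders_stg
        ERRORFILES stage_db.orders_err1, stage_db.orders_err2;
  INSERT INTO stage_db.orders_stg
  VALUES (:order_id, :order_date, :amount);
  END LOADING;
  LOGOFF;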
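
A sketch of the kind of DDL implied by the partitioning, index, and compression bullets above; all names, dates, and compressed values are illustrative:

  CREATE TABLE sales_db.orders
  ( order_id   INTEGER NOT NULL,
    order_date DATE FORMAT 'YYYY-MM-DD' NOT NULL,
    region_cd  CHAR(2) COMPRESS ('US', 'CA', 'MX'),  -- multi-value compression
    amount     DECIMAL(12,2)
  )
  PRIMARY INDEX (order_id)                           -- NUPI chosen from column demographics
  PARTITION BY RANGE_N (order_date BETWEEN DATE '2012-01-01'
                        AND DATE '2013-12-31' EACH INTERVAL '1' MONTH);

  -- Add a NUSI only where the access path justifies it
  CREATE INDEX (region_cd) ON sales_db.orders;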
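
For the DBQL/PDCR monitoring noted above, a minimal query sketch against the standard DBC.QryLogV view; the CPU threshold is illustrative:

  -- Yesterday's most CPU-intensive queries from the query log
  SELECT UserName,
         StartTime,
         AMPCPUTime,
         TotalIOCount,
         QueryText
  FROM   DBC.QryLogV
  WHERE  CAST(StartTime AS DATE) = CURRENT_DATE - 1
    AND  AMPCPUTime > 100          -- illustrative threshold, in CPU seconds
  ORDER  BY AMPCPUTime DESC;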

TECHNICAL SKILLS:

BI Tools: Business Objects 6.5.1, Cognos 8.4.1, BRIO, HP Yotta Reporting Tool, Tableau

ETL Tools: DataStage 7.5.1, Informatica 7.1, SSIS packages

Modeling Tools: Erwin 7.1

Scheduling Tools: SLJM, AutoSys, ESSPRESSO, TQS, Tidal Scheduler, Jenkins, TWS

GUIs: Visual Basic

Databases: Teradata 12.0/13.10/14.0, Oracle, AWS Redshift, SQL Server 2005, HP Neoview DB

Operating Systems: Windows, Unix, Mainframes

Mgmt. Tools: Excel

Languages: C, PL/SQL, Pro C, Python, Shell Scripting

ITIL Tools: OVSD (Open View Service Desk), HPSC, Clarify, Remedy, Peregrine ESM

Integration Tools: PVCS, Jenkins, Stash, GIT

PROFESSIONAL EXPERIENCE

Confidential, Miami, FL

Amazon Web Services - Technical Consultant:

Responsibilities:

  • Gathering requirements from business users to understand the data models.
  • Preparation of the mapping design documents.
  • Develop different data models, such as orders, products, and subscriber base.
  • Validation of the data in the data models.
  • Define the overall data warehouse architecture (ETL process, ODS).
  • Define technical requirements, technical and data architectures for the data warehouse
  • Recommend/select data warehouse technologies (ETL, DBMS, Data Quality, BI)
  • Define production release requirements and sustainment architecture.
  • Expert-level work with relational and MPP databases (SQL Server and AWS Redshift).
  • Experience using Python orchestration to transform, load, and unload data into dimension and fact tables (the underlying SQL is sketched after this list).
  • Experience with GIT/Jenkins/Stash tools for source code integration.
  • Developed control totals for audit purposes.
  • Fixing development/production issues in different models.
  • Interacting with business analysts to understand requirements and to design and develop the different subject models in the AWS Redshift data warehouse.
  • Extensive experience working with ETL processes and ETL toolsets dealing with large data sets.
  • Deep familiarity with data warehousing solutions such as Amazon Redshift.
  • Develop ETL processes of structured and unstructured data from various production sources to our centralized data warehouse.
  • Troubleshoot any performance issues in AWS Redshift and ensure optimal pipeline efficiency.
  • Proactively monitor and identify data discrepancies to ensure data quality, consistency and integrity in AWS Redshift Data warehouse.
  • Developing control totals (audit schema) for different data models to reconcile source-file record counts (see the control-total sketch after this list).
  • Writing SQL queries in the development of different models.
  • Monitoring queries in the AWS console to assess query performance and behavior.
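
As an illustration of the Python-orchestrated load/unload steps above, the Redshift SQL issued by such a job is along these lines; the bucket, IAM role, and table names are hypothetical:

  -- Bulk-load a dimension table from S3
  COPY dim_subscriber
  FROM 's3://example-bucket/subscriber/2016-05-01/'
  IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoadRole'
  DELIMITER '|'
  GZIP;

  -- Unload query results back to S3 for downstream consumers
  UNLOAD ('SELECT * FROM fact_orders WHERE order_date = CURRENT_DATE - 1')
  TO 's3://example-bucket/exports/orders_'
  IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoadRole'
  DELIMITER '|'
  GZIP;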
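
A minimal sketch of the control-total (audit schema) check described above, assuming a hypothetical audit table that records per-file source counts:

  -- Reconcile the loaded row count against the count recorded for the source file
  SELECT a.file_name,
         a.source_row_count,
         t.loaded_row_count,
         a.source_row_count - t.loaded_row_count AS diff
  FROM audit.load_control a                     -- hypothetical audit table
  CROSS JOIN (SELECT COUNT(*) AS loaded_row_count
              FROM stage.orders) t
  WHERE a.file_name = 'orders_20160501.txt';    -- hypothetical source file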

Environment: Win 2010, JIRA, Jenkins, Stash, GIT, AWS, Redshift, Tableau, Unix, DbFit, SQL Server 2010

Confidential

Teradata Solution Architect/Data Modeler

Responsibilities:

  • Find/generate executive level support for the data warehouse initiative
  • Define key business drivers for the data warehouse initiative
  • Deliver a project scope that directly supports the key business drivers
  • Define the overall data warehouse architecture (e.g., ETL process, ODS, CDW, Data Marts)
  • Define technical requirements, technical and data architectures for the data warehouse
  • Recommend/select data warehouse technologies (e.g., ETL, DBMS, Data Quality, BI)
  • Direct the discovery process.
  • Design and direct the ETL process, including data quality and testing
  • Design and direct the information access and delivery effort for the data warehouse
  • Define and direct the implementation of security requirements for the data warehouse
  • Define metadata standards for the data warehouse
  • Direct the data warehouse metadata capture and access effort
  • Define production release requirements and sustainment architecture.
  • Developed BTEQ, MultiLoad, FastLoad, and TPump scripts (a BTEQ sketch follows this list).
  • Optimized batch jobs and queries using Teradata optimization techniques.
  • Resource Monitoring - Database Query Log (DBQL) and Access Logging.
  • Viewpoint monitoring and checking of system status.
  • Involved in node upgrade processes and the Teradata version upgrade from 12.0 to 13.10.
  • Worked on long running batch jobs and improved performance of the batch jobs/queries.
  • Participate in projects to support development staff in their implementation of the model
  • Integrate newly sourced data into a common data model
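
A minimal BTEQ batch sketch of the kind referenced above; the logon and object names are hypothetical:

  .LOGON tdpid/etl_user,password;

  -- Incremental insert from staging into the base table
  INSERT INTO base_db.orders
  SELECT s.*
  FROM stage_db.orders_stg s
  WHERE NOT EXISTS (SELECT 1 FROM base_db.orders b
                    WHERE b.order_id = s.order_id);

  .IF ERRORCODE <> 0 THEN .QUIT 8;   -- fail the batch on any SQL error
  .LOGOFF;
  .QUIT 0;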

Environment: Win 2010, SUSE Linux, Teradata 13.10, DataStage 7.1, Cognos 8.x, Mainframes, SQL Server 2005, TWS, Erwin 7.1

Confidential

Teradata Technical Lead:

Responsibilities:

  • Provided case information to clients and discussed it with them to arrive at a solution for each defect.
  • Tested and implemented the proposed change in the macro to remove the data corruption.
  • Performed bug fixes and script enhancements; fixed operational issues in jobs to keep data current and meet business requirements; resolved high-priority application dockets raised by business users and end users.
  • Developed MultiLoad and FastLoad scripts to load data into the warehouse.
  • Effectively serviced business users, catering to their requirements for essential data delivery and quality issues.
  • Extensive analysis and use of various Teradata utilities such as FastLoad, BTEQ, and MultiLoad.
  • Involved in complete cycle of the knowledge transfer.
  • Active interaction with business users, participating in onshore-offshore conferences with the client, downstream business users, and other vendors under consideration.
  • Developing and modifying macros (sets of SQL scripts) involving various transformations per the business logic and client requirements (a macro sketch follows this list).
  • Defect fixing if any issue persists in coding after the code deployment to production as part of business readiness testing.
  • Extensively worked on incident and problem management issues.
  • Executed batches to load data into the data warehouse using Teradata macros and MultiLoad in the development/test environment.
  • Loaded data from various data sources and legacy systems into Teradata production and development warehouses using MultiLoad and FastLoad.
  • Developed MultiLoad, FastLoad, FastExport, and query jobs in UNIX environments.
  • Production support and maintenance of the IDW Applications/Data Marts.
  • Analyzed and fixed a few macros to resolve the data defects raised.
  • Fixed batch jobs to complete their execution whenever they failed.
  • Developing and testing macros for new batches as per business requirement.
  • Implemented and Tested the Kit (for code deployment in production) in the Development Server.
  • Reviewing all the deliverables and ensuring timely and defect free delivery.
  • Interaction between team members, Client IT Department & Business User.
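
For the macro development described above, a minimal Teradata macro sketch; names and logic are illustrative:

  -- A parameterized macro wrapping one transformation step
  REPLACE MACRO app_db.load_daily_orders (run_date DATE) AS
  ( INSERT INTO base_db.orders_daily
    SELECT order_id, order_date, amount
    FROM stage_db.orders_stg
    WHERE order_date = :run_date;
  );

  -- Invoked by the batch as:
  EXEC app_db.load_daily_orders (DATE '2010-06-01');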

Environment: Win XP, Teradata DB with Teradata FastLoad, MultiLoad, FastExport, and BTEQ, UNIX MP-RAS, SLJM, ESSPRESSO, Cognos 8.4.1

Confidential, TX

ETL - Lead Developer

Responsibilities:

  • SME for a suite of applications, providing L3-L4 support to many BIDW applications.
  • Resolving Day to Day production issues and providing the permanent solutions to recurring issues.
  • Involved in developing the metadata scripts for the new sources.
  • Review and manage the change requests and move the change to production.
  • Involved in testing the code in UAT before moving to production.
  • Creating Unit test cases.
  • Developed and implemented quick workarounds using SQL to fix data discrepancy issues during UAT sign-off.

Environment: Oracle 9i/10g, HP-UX, PL/SQL, BO XI, Informatica 7.0, Toad, Tidal Scheduler/HP Neoview Db.

Confidential

Datastage Developer

Responsibilities:

  • Designed DataStage ETL jobs to extract data from text files and Oracle and load it into Siebel base tables.
  • Designed DataStage jobs to load data into the data warehouse.
  • Designed multiple sequencers to group the jobs.
  • Moved designed jobs to development and production.
  • Worked on production analysis of the daily process flow of the batch hierarchy; provided warranty support for developed jobs.
  • Developed server jobs using OCI, ODBC, Transformer, Hashed File, and Sequential File stages.
  • Involved in inbound interface monitoring and outbound interface monitoring.
  • Identify other source systems like Oracle, their connectivity, related tables and fields and ensure data integration of the job.
  • Worked on different Help Desk cases to resolve the issues raised in Production batch hierarchies
  • Worked on different environments like STAR P, STAR D, STAR U, STAR T.
  • Troubleshot designed jobs using the DataStage Debugger.
  • Provided L3 and L4 support on SQL and PL/SQL.
  • Tuned DataStage transformations and jobs to enhance performance.
  • Working as Manager - Projects at IGATE Global Services from Oct 2014 to date.
  • Worked at Tech Mahindra as a Project Lead from Mar 2012 to Oct 2014.
  • Worked at Confidential as a Technical Team Leader from May 2006 to Mar 2012.
  • Worked at Accenture Services as a Software Engineer from Jan 2006 to May 2006.
