
Teradata/Hadoop Database Architect Resume


Tampa, FL

SUMMARY

  • Over 15 years of experience as a Teradata/Hadoop Database Architect.
  • Extensive experience in all aspects of administering and managing Oracle, SQL Server, PeopleSoft, Hadoop and Teradata production, test and development systems.
  • Actively involved in SDLC phases such as design, development, production reviews and migrations.
  • Expertise in implementing EDW, ODS and real-time solutions in Teradata.
  • Expertise in validating Unix/Linux Scripts, SQL Scripts, PL/SQL and Stored Procedures to ensure the successful execution of the ETL processes.
  • Knowledge of Data Mover, Unity, QueryGrid and DSA.
  • Experienced in Business Intelligence concepts such as relational database management, design, development, data migrations, implementation and project management.
  • Good experience administering Teradata development, test and production databases implemented on MP-RAS and Linux servers.
  • Good experience in optimization, performance enhancement and procedures for improving load performance across schemas and databases.
  • Proficient in ETL (Extraction, Transformation, Loading) using Informatica, GoldenGate, DataStage and O2T jobs.
  • Extensive experience as an application database administrator, including UNIX administration, storage management and client/server support.
  • Strong experience in logical and physical database design, data modeling, conceptual design and data architecture.
  • Strong knowledge of Teradata capacity planning, Teradata VMs on VMware, Lab Manager and IntelliCloud, Teradata dual active systems and data replication activities.
  • Advanced expertise in Teradata SQL and Teradata DBA activities such as database maintenance, tuning ETL queries and application code, user and database creation/maintenance, space management, releasing locks, performance monitoring using PMON, Teradata Manager, Viewpoint and APIs, and workload management using TASM, TIWM and PS.
  • Highly skilled in writing SQL queries, Stored Procedures, Cursors and Triggers.
  • Experience implementing real-time and web applications in Teradata.
  • Experienced in preparing design documents and writing Quality center build scripts and executing them in QC.
  • Excellent interpersonal, communication skills with the ability to manage multiple projects and meet deadlines.
  • Good knowledge of code and design reviews and other process-oriented procedures.
  • Good understanding of Data warehouse Life Cycle.
  • Good understanding of dimensional modeling as adopted to build data warehouses.
  • Well versed in Star Schema, Snowflake Schema and Data Vault modeling.
  • Managed and updated the Technical Specification Documents.
  • Widely used the Informatica Power Center Client Tools like Designer, Workflow Manager, and Workflow Monitor & Repository Manager.
  • Experience in Kerberos, Active Directory, Sentry, TLS/SSL, Linux/RHEL, UNIX, Windows, SBT, Maven and Jenkins.
  • Experience with disaster recovery and business continuity practice with Hadoop clusters.
  • Extensively used major transformations in PowerCenter Designer such as Source Qualifier, Lookup, Expression, Aggregator, Update Strategy, Normalizer and Sequence Generator.
  • Well versed in RDBMS such as Oracle, SQL Server and PeopleSoft, as well as external sources such as XML.
  • Expert knowledge of backup and recovery strategies. Good knowledge of other RDBMS technology.
  • Proven track record of successfully completing long-term and short-term projects within deadlines while adapting to rapidly changing environments and priorities.
  • Expertise in UNIX scripting.
  • Experience managing Cloudera Hadoop (CDH), Cloudera Manager, HDFS, YARN, MapReduce routines and Spark jobs.
  • Conduct systems design, feasibility and cost studies and recommend cost-effective cloud solutions.
  • Advise software development teams on architecting and designing web interfaces and infrastructures that safely and efficiently power the cloud environment.
  • Excellent communication and interpersonal skills; quick learner, self-motivated, able to work individually as well as in a team environment with good team spirit.
  • Excellent Analytical, Problem solving, communication and leadership skills.
  • Experience with Unix shell scripts, such as ksh, csh, bash or sh. Experience with scripting languages, such as Perl or Python.
  • Worked with data delivery teams to set up new Hadoop users, including setting up Linux users, creating Kerberos principals and testing HDFS, Hive, Pig and MapReduce access for the new users.
  • Teradata certified.

TECHNICAL SKILLS

Operating Systems: MP-RAS, Linux, Solaris, AIX, Windows XP/NT/2000

RDBMS: Teradata TD15, TD14, TD13, TD12, V2R6, V2R5, Oracle 8i, 9i, 10g, SQL Server 2005, Aster (big data), PeopleSoft, Hadoop, Couchbase, Hortonworks, Cloudera, MongoDB, Cassandra, Snowflake, Vertica

Cloud: AWS, Microsoft Azure, Informatica Cloud, Teradata IntelliCloud

Administrator Tools: Teradata Manager (TMGR), Teradata Administrator (WinDDI), Teradata SQL Assistant (Queryman), Teradata Dynamic Workload Manager (TDWM), Teradata Query Scheduler, Teradata Performance Monitor (PMON), TSET, Priority Scheduler (PS), Teradata Query Director, QueryGrid, Data Lab, Unity, Data Mover, Viewpoint, TASM

Performance Analysis Tools: Teradata Visual Explain, Teradata System Emulation Tool, Teradata Statistics Wizard, Teradata Index Wizard

Database Utilities: Schmon, Tdwmdmp, CTL/XCTL, Lokdisp, Showlocks, Qrysessn, Vprocmanager, Ferret, Packdisk, Scandisk and CheckTable

Load and Unload Utilities: BTEQ, FastExport, FastLoad, MultiLoad, TPump, TDCH and Teradata Parallel Transporter (TPT)

ETL and Reporting Tools: COGNOS, Informatica PowerCenter 9.1/8.6.0/8.1, Toad, Hive, Quality Center, Erwin, SSIS, GoldenGate and Tableau

Languages: SQL, C, C++, basic Java, Unix Shell Scripts and Perl Scripts

Backup Tools: DSA/DSC, Teradata ARC, TARA, ABU, NetVault, NetBackup and Data Domain

PROFESSIONAL EXPERIENCE

Confidential, Tampa, FL

Teradata/Hadoop Database Architect

Responsibilities:

  • Led the Big Data technical operations team in the Confidential Wireless environment to ensure that customers receive the necessary technical support for their business requirements.
  • Implementing Logical and Physical design of Databases, Data Modeling, Conceptual Design and Data Architecture.
  • Analyze and define client’s business strategy and determine system architecture requirements to achieve business goals.
  • Formulate strategic plans for component development to sustain future project objectives.
  • Gather detailed business requirements and use cases, and translate technical specifications into product requirements.
  • Create team strategies and establish project scopes of work; communicate project deliverable timeframes and benchmarks to clients.
  • Develop data architecture design to facilitate targeted customer analysis.
  • Suggest architectural improvements, design and integration solutions, and formulate methodologies to optimize object-oriented software and database development.
  • Daily maintenance of the clusters including keeping all the Big data related services up and running on the Teradata/Hadoop cluster.
  • Organize end-user training and problem analysis for server, desktop, and IT infrastructure work.
  • Diagnose and repair Unix and Windows processing errors, and implement solutions to improve efficiency.
  • Write and implement new server, middleware, and database standards.
  • Co-ordinate with the off-shore team in implementation of operations.
  • Evaluating the new Database services and Software Product Requirements, doing POC’s on the same to determine how it will help the end user.
  • Reviewing the projects in all the phases, Design, development, testing and migrations.
  • Expertise in high availability support for Teradata to keep the business up and running 24X7.
  • Custom report creation for customer engagements and delivery of projects on time.
  • Plan, implement and support any major upgrades that Confidential would undertake in Teradata technology.
  • Troubleshoot and develop on Hadoop technologies including HDFS, Hive, Pig, Flume, HBase, MongoDB, Accumulo, Tez, Spark, Storm, and Hadoop ETL development via tools such as Informatica.
  • Capacity planning and projecting the future demands.
  • Installing, maintaining and configuring TIWM as needed.
  • Viewpoint setup and monitoring the system using viewpoint portlets.
  • Analysis of the requirements and analyzing the current System, Designing the System based on requirement.
  • Automated the review and migration of development code.
  • Developed several customizations of Teradata, which helps in better administration.
  • Automated the stats collection process using Viewpoint to reduce overall resource usage.
  • Tuning the reporting queries and ETL queries using the optimization techniques.
  • Prepared and automated scripts for monitoring the system and alert mechanisms.
  • Refreshed data using ARC, FastExport, MultiLoad and FastLoad utilities.
  • Responsible for end to end support (design phase to post implementation) for development projects.
  • Maintain the standards based on System level Benchmark analysis.
  • Worked on TD up-gradation, expansion and migration projects.
  • Maintain the viewpoint, Teradata manager servers.
  • Working on Incidents with Teradata GSC.
  • Created UNIX scripting for cluster backup and recovery.
  • Maintenance activities such as CheckTable and Scandisk using Ferret utilities.
  • Table-level post-implementation analysis using DBQL data, implementing primary index, secondary index, PPI and statistics changes.
  • Generating and analyzing performance reports at the system and application levels for Prod and non-Prod systems and working to fix bottlenecks by coordinating with the application team(s).
  • Recommending the better ways of writing the queries to the users with tuning tips and educating them on the best practices.
  • Capacity management and MVC analysis.
  • Analyze, design and develop Business Intelligence solutions for AT&T using prior experience in big data technologies such as Teradata, Vertica, Hadoop MapReduce, Pig, Hive, HBase, Sqoop and Spark and programming languages such as Java, Scala, Python and shell scripting; support data insights and data profiling.
  • Automate and integrate new scripts and software solutions to increase speed of data analytics and visualization using SQL, Shell scripting and Python.
  • Extract and analyze data from specific applications, systems and/or databases using Teradata, Hadoop and Spark to curate and create reports that provide metrics and recommendations based on the analysis of the data.
  • Fetch, clean, transform and load data from different sources such as relational databases, Amazon S3 buckets, Google Cloud Platform, server logs from different servers, vendor-specific APIs and custom scripts into Hadoop/Teradata distributed file systems and the data lake.
  • Analyze system failures, identify root causes and recommend courses of action.
  • Analyze information and evaluate results using logic to choose the best solution and resolve application issues.
  • Implement advanced procedures like text analytics and processing using the in-memory computing capabilities like Apache Spark and Scala.
  • Develop the code base and write unit test cases to verify all the use cases according to the business requirements and modify the applications to optimize and improve the performance.
  • GoldenGate replication solutions design, implementation, and support for both unidirectional and bidirectional replication leveraging multiple features within the Oracle GoldenGate product suite.
  • Working with data delivery teams to set up new Hadoop users, including setting up Linux users, creating Kerberos principals and testing HDFS, Hive, Pig and MapReduce access for the new users (a minimal onboarding sketch follows this list).
  • Cluster maintenance as well as creation and removal of nodes using tools like Ganglia, Nagios, Cloudera Manager Enterprise, Dell Open Manage and other tools.
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
  • Screen Hadoop cluster job performances and capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
  • Collaborating with application teams to install operating system and Hadoop updates, patches, version upgrades when required.
  • Point of Contact for Vendor escalation.
  • Configured and managed public/private cloud infrastructures using Amazon Web Services (AWS) such as VPC, EC2, S3, CloudFront, ELB and Elastic Beanstalk.
  • Designed multiple EC2 instances to attain high availability and fault tolerance using Auto Scaling and Elastic Load Balancing.
  • Configured security parameters by managing AWS users and groups with AWS Identity and Access Management (IAM) to specify the policies and roles that allow or deny access to AWS resources.
  • Used CloudFront to deliver content from AWS edge locations to users, allowing further reduction of load on front-end servers.
  • Utilized the AWS CLI to automate backups of ephemeral data stores to S3 buckets and EBS and to create nightly AMIs of mission-critical production servers as backups.
  • Created detailed AWS security groups, which behave as virtual firewalls controlling the traffic allowed to reach one or more AWS EC2 instances.
  • Implemented rapid provisioning and lifecycle management for Ubuntu Linux using Amazon EC2 and Chef, writing recipes and Ruby scripts to customize Chef for our environment and maintaining infrastructure automation with Chef.
  • Migrated media (images and videos) to S3 and used CloudFront to distribute content with low latency and high data transfer rates.
  • Utilized CloudWatch and LogicMonitor to monitor resources such as EC2, EBS, ELB, RDS and S3.
  • Moving sample data and production data demographics to development databases using TSET.
  • Used TIWM for defining the workloads for ETLs and reporting.
  • Conduct systems design, feasibility and cost studies and recommend cost-effective cloud solutions.
  • Advise software development teams on architecting and designing web interfaces and infrastructures that safely and efficiently power the cloud environment.
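
For illustration, below is a minimal shell sketch of the Hadoop user onboarding described above. The realm, group, keytab paths and test commands are placeholder assumptions; the actual process followed site standards and the cluster's security setup (Cloudera Manager, Sentry, etc.).

#!/bin/bash
# Onboard a new Hadoop user: Linux account, Kerberos principal, HDFS home, smoke tests.
# User name, realm, group and keytab paths below are placeholders for illustration.
set -euo pipefail

NEW_USER="$1"                      # e.g. jdoe
REALM="EXAMPLE.COM"                # placeholder Kerberos realm

# 1. Local Linux account on the edge node (group assumed to exist)
sudo useradd -m -g hadoopusers "$NEW_USER"

# 2. Kerberos principal and keytab (MIT KDC assumed)
sudo kadmin.local -q "addprinc -randkey ${NEW_USER}@${REALM}"
sudo kadmin.local -q "ktadd -k /etc/security/keytabs/${NEW_USER}.keytab ${NEW_USER}@${REALM}"

# 3. HDFS home directory owned by the new user
sudo -u hdfs kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs
sudo -u hdfs hdfs dfs -mkdir -p "/user/${NEW_USER}"
sudo -u hdfs hdfs dfs -chown "${NEW_USER}:hadoopusers" "/user/${NEW_USER}"

# 4. Smoke tests as the new user: HDFS, Hive and Pig access
sudo -u "$NEW_USER" kinit -kt "/etc/security/keytabs/${NEW_USER}.keytab" "${NEW_USER}@${REALM}"
sudo -u "$NEW_USER" hdfs dfs -ls "/user/${NEW_USER}"
sudo -u "$NEW_USER" hive -e "SHOW DATABASES;"
sudo -u "$NEW_USER" pig -e "fs -ls /user/${NEW_USER};"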

Confidential, LA

Teradata/Hadoop Database Architect

Responsibilities:

  • Led the Big Data technical operations team in the Confidential environment to ensure that customers receive the necessary technical support for their business requirements.
  • Implementing Logical and Physical design of Databases, Data Modeling, Conceptual Design and Data Architecture.
  • Analyze and define client’s business strategy and determine system architecture requirements to achieve business goals.
  • Formulate strategic plans for component development to sustain future project objectives.
  • Gather detailed business requirements and use cases, and translate technical specifications into product requirements.
  • Create team strategies and establish project scopes of work, communicate project deliverable timeframes and benchmarks to clients.
  • Develop data architecture design to facilitate targeted customer analysis.
  • Suggest architectural improvements, design and integration solutions, and formulate methodologies to optimize object-oriented software and database development.
  • Daily maintenance of the clusters including keeping all the Hadoop related services up and running on the Teradata/Hadoop cluster.
  • Organize end-user training and problem analysis for server, desktop, and IT infrastructure work.
  • Diagnose and repair UNIX and Windows processing errors, and implement solutions to improve efficiency.
  • Write and implement new server, middleware, and database standards.
  • Co-ordinate with the offshore team in implementation of operations.
  • Evaluating the new Database services and Software Product Requirements, doing POC’s on the same to determine how it will help the end user.
  • Reviewing the projects in all the phases, Design, development, testing and migrations.
  • Expertise in high availability support for Teradata to keep the business up and running 24X7.
  • Custom report creation for customer engagements and delivery of projects on time.
  • Plan, implement and support any major upgrades that Confidential would undertake in Teradata technology.
  • Capacity planning and projecting the future demands.
  • Installing, maintaining and configuring TASM as needed.
  • Viewpoint setup and monitoring the system using viewpoint portlets.
  • Implemented the High availability/Disaster Recovery solutions for Teradata Database and Viewpoint.
  • Analysis of the requirements and analyzing the current System, Designing the System based on requirement.
  • Automated the review and migration of development code.
  • Developed several customizations of Teradata, which helps in better administration.
  • Automated stats collection process using viewpoint to reduce the overall resource usage.
  • Tuning the reporting queries and ETL queries using the optimization techniques.
  • Prepared and automated scripts for monitoring the system and alert mechanisms.
  • Refreshed data using DSA/ARC, FastExport, MultiLoad and FastLoad utilities.
  • Responsible for end-to-end support (design phase to post implementation) for development projects.
  • Maintain the standards based on System level Benchmark analysis.
  • Worked on TD up-gradation, expansion and migration projects.
  • Maintain the viewpoint, Teradata manager servers.
  • Working on Incidents with Teradata GSC.
  • Created UNIX scripting for cluster backup and recovery.
  • Maintenance activities such as CheckTable and Scandisk using Ferret utilities.
  • Table-level post-implementation analysis using DBQL data, implementing primary index, secondary index, PPI and statistics changes.
  • Generating and analyzing the performance reports at system level as well as application levels for Prod and non-Prod systems and working towards fixing the bottlenecks by coordinating with Application team(s).
  • Recommending the better ways of writing the queries to the users with tuning tips and educating them on the best practices.
  • Capacity management and MVC analysis.
  • Moving sample data and production data demographics to development databases using TSET.
  • Used TIWM for defining the workloads for ETLs and reporting.
  • Golden Gate replication solutions design, implementation, and support for both unidirectional and bidirectional replication leveraging multiple features within the Oracle Golden Gate product suite.
  • Working with data delivery teams to set up new Hadoop users, including setting up Linux users, creating Kerberos principals and testing HDFS, Hive, Pig and MapReduce access for the new users.
  • Cluster maintenance as well as creation and removal of nodes using tools like Ganglia, Nagios, Cloudera Manager Enterprise, Dell Open Manage and other tools.
  • Performance tuning of Hadoop clusters and Hadoop Map Reduce routines.
  • Screen Hadoop cluster job performances and capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
  • Collaborating with application teams to install operating system and Hadoop updates, patches, version upgrades when required.
  • Point of Contact for Vendor escalation.
  • Configured and managed public/private cloud infrastructures using Amazon Web Services (AWS) such as VPC, EC2, S3, CloudFront, ELB and Elastic Beanstalk.
  • Designed multiple EC2 instances to attain high availability and fault tolerance using Auto Scaling and Elastic Load Balancing.
  • Configured security parameters by managing AWS users and groups with AWS Identity and Access Management (IAM) to specify the policies and roles that allow or deny access to AWS resources.
  • Used CloudFront to deliver content from AWS edge locations to users, allowing further reduction of load on front-end servers.
  • Utilized the AWS CLI to automate backups of ephemeral data stores to S3 buckets and EBS and to create nightly AMIs of mission-critical production servers as backups (a minimal backup sketch follows this list).
  • Created detailed AWS security groups, which behave as virtual firewalls controlling the traffic allowed to reach one or more AWS EC2 instances.
  • Implemented rapid provisioning and lifecycle management for Ubuntu Linux using Amazon EC2 and Chef, writing recipes and Ruby scripts to customize Chef for our environment and maintaining infrastructure automation with Chef.
  • Migrated media (images and videos) to S3 and used CloudFront to distribute content with low latency and high data transfer rates.
  • Utilized CloudWatch and LogicMonitor to monitor resources such as EC2, EBS, ELB, RDS and S3.
  • Conduct systems design, feasibility, cost studies, and recommend cost-effective cloud solutions.
  • Advise software development teams on architecting and designing web interfaces and infrastructures that safely and efficiently power the cloud environment.
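
For illustration, below is a minimal shell sketch of the nightly backup automation mentioned above, using the standard AWS CLI. The bucket name, tag, region and local path are placeholder assumptions, not the actual production settings.

#!/bin/bash
# Nightly backup sketch: sync ephemeral data to S3 and create AMIs for tagged instances.
# Bucket, tag, region and local path are placeholders for illustration.
set -euo pipefail

REGION="us-east-1"
BUCKET="s3://example-prod-backups"
DATE=$(date +%Y%m%d)

# 1. Sync the local ephemeral data store to S3
aws s3 sync /data/ephemeral "${BUCKET}/ephemeral/${DATE}/" --region "$REGION"

# 2. Create a no-reboot AMI for every instance tagged Backup=nightly
for INSTANCE_ID in $(aws ec2 describe-instances --region "$REGION" \
      --filters "Name=tag:Backup,Values=nightly" \
      --query "Reservations[].Instances[].InstanceId" --output text); do
    aws ec2 create-image --region "$REGION" \
        --instance-id "$INSTANCE_ID" \
        --name "nightly-${INSTANCE_ID}-${DATE}" \
        --no-reboot
done

In practice a job like this runs from cron on a management host, with old AMIs and snapshots pruned on a retention schedule.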

Confidential

Teradata/Hadoop Database Architect

Responsibilities:

  • Led the Teradata and Hadoop technical operations team in the Confidential Technology, Inc. environment to ensure that customers receive the necessary technical support for their business, fab and manufacturing requirements.
  • Led team to plan, design, and implement applications and software.
  • Collaborated with business analysts, developers, and technical support teams to define project requirements and specifications.
  • Designed, developed, and managed web-based applications, databases, network accounts, and programs.
  • Launched complex recovery solutions to safeguard mission critical data.
  • Plan, implement and support any major upgrades that Confidential Technology Inc. would undertake in Teradata technology.
  • Daily maintenance of the clusters including keeping all the Teradata and Hadoop related services up and running on the Teradata cluster.
  • Evaluating the new Teradata services and Software Product Requirements, doing POC’s on the same to determine how it will help the end user.
  • Implementing Logical and Physical design of Databases, Data Modeling, Conceptual Design and Data Architecture.
  • Implemented the Query Grid, Unity and DSA/DSC.
  • Implemented LDAP, Kerberos and hardening on the Teradata cluster.
  • Implemented account provisioning using Active Directory on Teradata and Hadoop.
  • Designed and implemented the system migration plan from 2750 to 2800 and from 2800 to IntelliFlex.
  • Co-ordinate with the off-shore team in implementation of operations.
  • Reviewing the projects in all the phases, Design, development, testing and migrations.
  • Expertise in high availability support for Teradata to keep the business up and running 24X7.
  • Implemented the High availability/Disaster Recovery solutions for Teradata Database and Viewpoint.
  • Custom report creation for customer engagements and delivery of projects on time.
  • Worked with BI Architects to develop and use new standards of development.
  • Capacity planning and projecting the future demands.
  • Installed Teradata utilities like TDODBC, TPT, load unload utilities on different platforms like Unix, Linux and Windows.
  • Installing, maintaining and configuring TIWM as needed.
  • Installation and build the PDCR reporting services.
  • Installed and configured Data Labs.
  • Viewpoint setup and monitoring the system using viewpoint portlets.
  • Analysis of the requirements and analyzing the current System, Designing the System based on requirement.
  • Workload management using TIWM and PS.
  • Automated the review and migration of development code.
  • Developed several customizations of Teradata, which helps in better administration.
  • Automated the stats collection process using Viewpoint to reduce overall resource usage.
  • Creating and managing user accounts, databases and data.
  • Tuning the reporting queries and ETL queries using the optimization techniques.
  • Prepared and automated scripts for monitoring the system and alert mechanisms.
  • Refreshed data using DSA, ARC, FastExport, MultiLoad and FastLoad utilities.
  • Customized and automated many maintenance jobs using UNIX scripts.
  • Worked on TTU and DBMS upgrade activities like planning, testing and coordinating with various teams.
  • Responsible for end to end support (design phase to post implementation) for development projects.
  • Maintain the standards based on System level Benchmark analysis.
  • Worked on TD up-gradation, expansion and migration projects.
  • Maintain the viewpoint, Teradata manager servers.
  • Working on Incidents with Teradata GSC.
  • Created UNIX scripts to create users and databases as per standards (a minimal sketch follows this list).
  • Created and automated UNIX scripts to remove duplicate and extra privileges from users and roles.
  • Created UNIX scripting for cluster backup and recovery.
  • Maintenance activities such as CheckTable, Scandisk and Packdisk using Ferret utilities.
  • Table-level post-implementation analysis using DBQL data, implementing primary index, secondary index, PPI and statistics changes.
  • Generating and analyzing performance reports at the system and application levels for Prod and non-Prod systems and working to fix bottlenecks by coordinating with the application team(s).
  • Recommending the better ways of writing the queries to the users with tuning tips and educating them on the best practices.
  • Monitoring the NetVault backup jobs and restoring data as required.
  • Managed Data Domain services and implemented the monitoring and scheduled cleanup process.
  • Capacity management and MVC analysis.
  • Designed and developed system capacity and KPI dashboards in Tableau for Teradata and Hadoop.
  • Moving sample data and production data demographics to development databases using TSET.
  • Used TIWM for defining the workloads for ETLs and reporting.
  • Working with data delivery teams to set up new Hadoop users, including setting up Linux users, creating Kerberos principals and testing HDFS, Hive, Pig and MapReduce access for the new users.
  • Cluster maintenance as well as creation and removal of nodes using tools like Ganglia, Nagios, Cloudera Manager Enterprise, Dell Open Manage and other tools.
  • Performance tuning of Hadoop clusters and Hadoop Map Reduce routines.
  • Screen Hadoop cluster job performances and capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
  • Collaborating with application teams to install operating system and Hadoop updates, patches, version upgrades when required.
  • Point of Contact for Vendor escalation
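
For illustration, below is a minimal sketch of the standards-based user creation scripting mentioned above, driving BTEQ from a shell wrapper. The TDPID, parent database, profile, role, spool value and credential handling are placeholder assumptions; the actual site standards differed.

#!/bin/ksh
# Create a Teradata user per site standards via BTEQ.
# System name, parent database, profile, role and spool values are placeholders.
NEW_USER="$1"
TDPID="tdprod"
DBC_PWD=$(cat /secure/.dbc_pwd)     # credentials kept outside the script

bteq <<EOF
.LOGON ${TDPID}/dbc,${DBC_PWD}

CREATE USER ${NEW_USER} FROM end_users AS
    PERM = 0,
    SPOOL = 50e9,
    PASSWORD = "Temp#${RANDOM}",
    PROFILE = std_user_profile,
    DEFAULT DATABASE = user_sandbox;

GRANT std_read_role TO ${NEW_USER};

.LOGOFF
.EXIT
EOF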

Confidential, Denver, CO

Teradata/Hadoop Database Architect

Responsibilities:

  • Installing and implementing CPPT, DBQL and other DBA metadata jobs.
  • Worked on TD up-gradation, expansion and migration projects.
  • Maintain the viewpoint, Teradata manager servers.
  • Maintain the standards based on System level Benchmark analysis.
  • Workload management using TASM, TIWM and PS.
  • Automated the review and migration of development code.
  • Configuring the alert policies in TMGR and help in system monitoring.
  • Developed several customizations of Teradata, which helps in better administration.
  • Automated stats collection process reducing the overall resource usage.
  • Creating and managing user accounts, databases and data.
  • Reviewing the projects in all the phases, Design, development, testing and migrations.
  • Tuning the reporting queries and ETL queries using the optimization techniques.
  • Prepared and automated scripts for monitoring the system and alert mechanisms.
  • Installed Teradata utilities like TDODBC, TPT, load unload utilities on different platforms like Unix, Linux and Windows.
  • Refreshed data using ARC, FastExport, MultiLoad and FastLoad utilities.
  • Viewpoint setup and monitoring the system using viewpoint portlets.
  • Customized and automated many maintenance jobs using Unix scripts.
  • Installing, maintaining and configuring TIWM/TASM as needed.
  • Worked with BI Architects to develop and use new standards of development.
  • Capacity planning and projecting the future demands.
  • Worked on TTU and DBMS upgrade activities like planning, testing and coordinating with various teams.
  • Responsible for end to end support (design phase to post implementation) for development projects.
  • Working on Incidents with Teradata GSC.
  • Created UNIX scripts to create users and databases as per standards.
  • Created and automated UNIX scripts to remove duplicate and extra privileges from users and roles.
  • Created UNIX scripting for cluster backup and recovery.
  • Maintenance activities such as CheckTable, Scandisk and Packdisk using Ferret utilities.
  • Table-level post-implementation analysis using DBQL data, implementing primary index, secondary index, PPI and statistics changes.
  • Used Erwin for data modeling.
  • Generating and analyzing performance reports at the system and application levels for Prod and non-Prod systems and working to fix bottlenecks by coordinating with the application team(s) (a DBQL reporting sketch follows this list).
  • Recommending the better ways of writing the queries to the users with tuning tips and educating them on the best practices.
  • Monitoring the NetVault backup jobs and restoring data as required.
  • Capacity management and MVC analysis.
  • Moving sample data and production data demographics to development databases using TSET.
  • Used TDWM for defining the workloads for ETLs and reporting.
  • Upgrading system from TD13 to TD14.10.
  • Migrating Nodes from 5500 to 2750.
  • Backup setup using Tara.
  • NetBackup implementation.
  • Working with data delivery teams to set up new Hadoop users, including setting up Linux users, creating Kerberos principals and testing HDFS, Hive, Pig and MapReduce access for the new users.
  • Cluster maintenance as well as creation and removal of nodes using tools like Ganglia, Nagios, Cloudera Manager Enterprise, Dell Open Manage and other tools.
  • Performance tuning of Hadoop clusters and Hadoop Map Reduce routines.
  • Screen Hadoop cluster job performances and capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
  • Collaborating with application teams to install operating system and Hadoop updates, patches, version upgrades when required.
  • Point of Contact for Vendor escalation
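
For illustration, below is a minimal shell sketch of the kind of DBQL-based performance reporting mentioned above. The system name, credential handling, recipient and the choice to query DBC.DBQLogTbl directly (rather than a PDCR history copy) are placeholder assumptions.

#!/bin/ksh
# Weekly DBQL report: top CPU consumers over the previous 7 days, mailed to the DBA team.
# System name, credentials file and recipient are placeholders for illustration.
TDPID="tdprod"
DBA_PWD=$(cat /secure/.dba_pwd)
REPORT=/tmp/dbql_top_cpu.txt

bteq > "$REPORT" <<EOF
.LOGON ${TDPID}/dbadmin,${DBA_PWD}
.SET WIDTH 200

SELECT  UserName,
        COUNT(*)          AS QueryCount,
        SUM(AMPCPUTime)   AS TotalCPU,
        SUM(TotalIOCount) AS TotalIO
FROM    DBC.DBQLogTbl
WHERE   StartTime >= CURRENT_TIMESTAMP - INTERVAL '7' DAY
GROUP BY 1
ORDER BY TotalCPU DESC;

.LOGOFF
.EXIT
EOF

mailx -s "Weekly DBQL CPU report" dba-team@example.com < "$REPORT"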

Confidential

Senior Teradata DBA

Responsibilities:

  • Migrated data from SQL Server to Teradata (a minimal migration sketch follows this list).
  • Workload management using TIWM and PS.
  • Installed and configured NetBackup and ABU.
  • Viewpoint setup and monitoring the system using viewpoint portlets.
  • Customized and automated many maintenance jobs using Unix scripts.
  • Installing, maintaining and configuring TIWM as needed.
  • Automated the review and migration of development code.
  • Configuring the alert policies in TMGR and help in system monitoring.
  • Developed several customizations of Teradata, which helps in better administration.
  • Automated stats collection process reducing the overall resource usage.
  • Creating and managing user accounts, databases and data.
  • Reviewing the projects in all the phases, Design, development, testing and migrations.
  • Working on Incidents with Teradata GSC.
  • Created UNIX scripts to create users and databases as per standards.
  • Created and automated UNIX scripts to remove duplicate and extra privileges from users and roles.
  • Created UNIX scripting for cluster backup and recovery.
  • Maintenance activities such as CheckTable, Scandisk and Packdisk using Ferret utilities.
  • Table-level post-implementation analysis using DBQL data, implementing primary index, secondary index, PPI and statistics changes.
  • Tuning the reporting queries and ETL queries using the optimization techniques.
  • Prepared and automated scripts for monitoring the system and alert mechanisms.
  • Installed Teradata utilities like TDODBC, TPT, load unload utilities on different platforms like Unix, Linux and Windows.
  • Refreshed data using ARC, FastExport, MultiLoad and FastLoad utilities.
  • Worked with BI Architects to develop and use new standards of development.
  • Capacity planning and projecting the future demands.
  • Installing and implementing CPPT, PDCR and other DBA metadata jobs.
  • Worked on TTU and DBMS upgrade activities like planning, testing and coordinating with various teams.
  • Responsible for end to end support (design phase to post implementation) for development projects.
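
For illustration, below is a minimal shell sketch of one table's path in the SQL Server to Teradata migration mentioned above: a bcp export to a delimited file followed by a Teradata FastLoad. Server names, credentials, table and column names are placeholder assumptions, and the real migration covered many tables and data types.

#!/bin/ksh
# One-table sketch of the SQL Server -> Teradata migration path.
# Server names, credentials, tables and columns are placeholders for illustration.

# 1. Export from SQL Server as pipe-delimited character data
bcp SalesDB.dbo.customer out /staging/customer.dat -c -t"|" \
    -S sqlprod01 -U etl_user -P "$(cat /secure/.mssql_pwd)"

# 2. Load into Teradata with FastLoad (empty target table and no leftover error tables assumed)
fastload <<'EOF'
LOGON tdprod/etl_user,etl_password;
DATABASE stage_db;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(20)),
       cust_name (VARCHAR(100)),
       city      (VARCHAR(50))
FILE = /staging/customer.dat;
BEGIN LOADING stage_db.customer
      ERRORFILES stage_db.customer_err1, stage_db.customer_err2;
INSERT INTO stage_db.customer (cust_id, cust_name, city)
VALUES (:cust_id, :cust_name, :city);
END LOADING;
LOGOFF;
EOF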

Confidential

Java Developer

Responsibilities:

  • Gathered requirements from analysts and understand client’s requirements.
  • Involved in the detailed design of the modules according to J2EE standards.
  • Designed the application, implemented the design, and provided testing and maintenance support.
  • Involved in all phases of the software development life cycle, including requirement gathering.
  • Involved in the complete development process for these modules with Struts 1.2.
  • Developed Web-Service client with Axis 1.2 using SOAP.
  • Implemented JavaScript for client-side validations.
  • Implemented JMS for External System Asynchronous Transactions.
  • Data modeling and writing stored procedures and SQL and PL/SQL queries in Oracle 10g.
  • Developed Proof of Concepts and provided work/time estimates for design and development efforts.
  • Coordinated with the QA lead for development of test plan, test cases, actual testing and actively participated in resolving the defects.
  • Analyzing business requirements, installing and upgrading the application, domain and database from WebLogic Portal 8.1.5 to WebLogic Portal 10.2.
  • Ensured JDK 1.5 and JSP 2.0 compatibility.
