
Teradata Administrator Resume


SUMMARY:

  • Powering business success by fusing technology with business strategy
  • Technically sophisticated, business-savvy data professional with 9 years of experience and a documented history of bridging technical and management acumen to turn around information technology initiatives.
  • Solid experience in Big Data, data warehouse, and Enterprise Data Lake (EDL) integration and migration projects: building large big data pipelines and batch jobs, integrating different sources into one, migrating existing projects to Hadoop, architecting big data and data warehouse solutions, performance optimization, real-time big data ingestion, and big data analysis.
  • Deft at meeting new technical challenges and finding solutions to customer needs, combining rich analytical knowledge with agile delivery and new designs for Big Data/Hadoop/data warehouse/EDL.
  • Excel at collaborating with project teams, interfacing with multiple stakeholders, and deploying technology to build successful IT solutions for clients.
  • Quick to assimilate new ideas, concepts, and cutting-edge technologies while demonstrating a logical and analytical approach to solving complex problems and issues.
  • Run the blog datadosa.com on AWS (a self-driven initiative to teach globally).
  • Managed teams of 10 to 20 and data warehouse/Big Data projects for more than 5 years.
  • Conduct ETL (Teradata, Informatica, Datastage, SSIS), SQL and DB performance tuning, troubleshooting, support, and capacity estimation to ensure the highest data quality standards
  • Industry experience with Big Data technologies (Hadoop, Hive, Spark, SQL, Kafka, Python, Scala, etc.)
  • Experience with real time data pipeline and architecture
  • Conduct dimensional modelling, master data management, metadata management, data cleaning and warehouse querying
  • Sound knowledge of Agile development (Scrum and Kanban), Waterfall, and SAFe methodologies (JIRA, Confluence, and HP Quality Centre) and best practices (code reviews, testing, etc.) to develop and deliver data products
  • Experience handling unstructured data and building data pipelines
  • Experience with data modelling/Data Vault, enterprise warehousing (Erwin tool), and EDL (Data Lake)
  • Strong proficiency with relational databases (Oracle, DB2, SQL, Teradata, HIVE, etc.), reading and writing SQL, and implementing data pipelines handling incremental data volumes up to 100 TB, with database tuning (indexing strategies, partitioning)
  • Strong understanding of Data Governance and Master Data Management principles
  • Experience with data visualization tools such as Business Objects and Tableau
  • Operating systems: Unix, Windows, Mainframe
  • Data model experience in Finance (FSLDM) and Transportation (TLDM), and understanding of CLDM
  • Experience with models such as Star Schema, Snowflake Schema, Kimball, Inmon, OLTP, and OLAP
  • 9+ years with a strong specialization in Teradata development (TPT, MLOAD, FASTEXPORT, FASTLOAD, optimization) and distributed data storage systems and their ecosystems; also worked as a Teradata Administrator
  • Exposure to cloud tools such as IPASS and AWS

EXPERTISE AREA:

  • SAFe, JIRA, Confluence
  • Agile & DevOps Methodology
  • ETL tools: Informatica, SSIS
  • DataStage certified
  • Team Management
  • IT Strategy and Planning
  • Hadoop/Cloudera/Hortonworks
  • Big Data Migration/Architecture
  • Cross-functional Coordination
  • Data Warehouse Design

TECHNICAL SKILLS:

Database: Teradata 14.0, Oracle, Hive, Dimension Modelling, Snowflake Schema, Star Schema, ETL Testing

ETL and cloud tools: Informatica, DataStage, Teradata utilities (MLoad, FastExport, TPT, TPump, FastLoad), Ipass, SSIS; trained in the Hadoop ecosystem (MapReduce, Kafka, Spark, Hive, Sqoop), Machine Learning, Master Data Management (MDM)

Business Intelligence and Reports: Business Objects, Tableau

Process: Agile, Scrum Master, JIRA, DevOps, HP Quality Centre, SDLC, Waterfall, Erwin tool

Scripting: Unix Shell Scripting, VB Scripting

Concepts: Performance Tuning, Indexing, Optimization, Data Lake, Data Mart, Data Vault

Domain Experience: Finance, Transportation, Hospitality, Banking

CAREER PROGRESSION:

Confidential

Teradata Administrator

Responsibilities:

  • Demonstrated capability by implementing real-time streaming with Spark Streaming (coded in Scala) over Twitter data to surface trending hashtags; a minimal Scala sketch of the approach appears after this list
  • Prepared the Sales Repository for the big data ecosystem and coded the logic for surrogate key generation and SCD-2 (Type 2 slowly changing dimensions); see the second sketch after this list
  • Played an important role in offloading the data warehouse to the big data ecosystem
  • Mentored juniors and trained teams on big data technologies
  • Interacting with customers & team for requirement gathering, risk assessment, and finalization of Architectural/Functional design
  • Participated in technical design and coding of Software Applications, mapping requirements, and in the finalization of product specifications and selection of appropriate techniques
  • Responsible for overall execution of projects, ensuring quality of deliverables & productivity improvements
  • Developed plans and schedules, resource allocations, manpower deployment, and team meetings for individual projects; also worked with third-party vendors
  • Involved in hiring and inducting the right talent & forming right teams for development
  • Responsible for creating and maintaining design and support documentation
  • Re-engineered systems to comply with GDPR for the European market
  • Implemented logic across different data models to produce a single version of truth in denormalized tables
  • Led the complete development effort
  • Gather and define business requirements while managing the risks to improve business processes, thereby contributing to enterprise architecture development from a business needs point of view through business analysis and map processes
  • Define the business mission and performance standards across all functional areas and periodically review performance with the deft application of concurrent management audit procedures
  • Organize various training sessions for the team to enhance their performance and train them on Hadoop
  • Ensure technical solutions are designed for performance, reliability, scalability, maintainability, supportability, business continuity and business agility while leveraging industry’s best practices
  • Deftly serve as ‘Single Point of Contact/Interface’ for supporting clients
  • Conduct ‘SWOT’ analysis and utilize findings for designing customized strategies to enhance customer service
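A minimal Scala sketch of the trending-hashtag streaming job mentioned above, assuming tweet text arrives on a local socket stream; the Twitter connector, credentials, host/port, and window sizes shown here are illustrative stand-ins, not the original implementation:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object TrendingHashtags {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("TrendingHashtags").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(10))   // 10-second micro-batches

    // Stand-in source: one tweet's text per line on a local socket
    val tweets   = ssc.socketTextStream("localhost", 9999)
    val hashtags = tweets.flatMap(_.split("\\s+")).filter(_.startsWith("#"))

    // Count each hashtag over a sliding 60-second window, refreshed every 10 seconds
    val counts = hashtags
      .map((_, 1))
      .reduceByKeyAndWindow((a: Int, b: Int) => a + b, Seconds(60), Seconds(10))

    // Print the current top 10 hashtags for each window
    counts.foreachRDD { rdd =>
      val top = rdd.sortBy(_._2, ascending = false).take(10)
      println(s"Trending hashtags: ${top.mkString(", ")}")
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```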
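And a minimal Spark/Scala sketch of the surrogate key and SCD-2 idea, assuming hypothetical column names (customer_id as the natural key, attrs as the tracked attributes, sk as the surrogate key, eff_from/eff_to/is_current for versioning), string-typed effective dates, and an incoming feed that carries only the natural key and tracked attributes:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

object Scd2Sketch {

  /** Apply a Type 2 slowly-changing-dimension update; all column names are hypothetical. */
  def applyScd2(spark: SparkSession,
                dim: DataFrame,
                incoming: DataFrame,
                loadDate: String): DataFrame = {
    import spark.implicits._

    // Highest surrogate key already issued (0 if the dimension is empty)
    val maxSk = dim.agg(coalesce(max($"sk"), lit(0L))).as[Long].first()

    val current = dim.filter($"is_current" === true)

    // Incoming rows that are brand new or whose tracked attributes changed
    val changedOrNew = incoming.as("i")
      .join(current.as("d"), $"i.customer_id" === $"d.customer_id", "left")
      .filter($"d.customer_id".isNull || $"i.attrs" =!= $"d.attrs")
      .select($"i.*")

    val changingKeys = changedOrNew.select($"customer_id").distinct()

    // Close out the current versions of changed keys
    val expired = current
      .join(changingKeys, Seq("customer_id"))
      .withColumn("eff_to", lit(loadDate))
      .withColumn("is_current", lit(false))

    // Everything else in the dimension is carried over untouched
    val untouched = dim.join(expired.select($"sk"), Seq("sk"), "left_anti")

    // New versions get fresh surrogate keys offset by the current maximum
    // (a global row_number window is fine for a sketch, not for huge volumes)
    val newVersions = changedOrNew
      .withColumn("sk", row_number().over(Window.orderBy($"customer_id")).cast("long") + lit(maxSk))
      .withColumn("eff_from", lit(loadDate))
      .withColumn("eff_to", lit("9999-12-31"))
      .withColumn("is_current", lit(true))

    untouched.unionByName(expired).unionByName(newVersions)
  }
}
```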

Confidential

Teradata Administrator

Responsibilities:

  • Structured project proposals complete with details of activities, time frames, and the required mix of resources. Made business presentations to clients to generate the value proposition and secure financial commitments. Worked as administrator carrying out all kinds of migrations and implementing roles, space, and security
  • Excelled as a tech lead while managing multiple projects, ensuring their successful completion and smooth execution and implementation
  • Dealt with various technical aspects of the projects including analysis of project requirements, technical guidance, estimation, scheduling, and final delivery of the solutions while focusing on competence enhancement activities
  • Collaborated with the team members and senior management to maintain a continuous stream of information regarding the project status and progress
  • Catalyzed business growth with constant impetus of strategic initiatives across diverse functional domains
  • Efficiently furnished guidance to clients on the projects and their requirements across technology, processes, and applications while updating them on regular project-related developments
  • Actively involved in preparing estimates for product testing activities, developing plans for testing and UAT while maintaining the resource matrix for task allocations

Confidential

Teradata Administrator

Responsibilities:

  • Performed predictive analysis to identify sector-, country-, and region-wise banker activities for forthcoming quarters and years
  • Built a predictive analysis report estimating the likelihood of deals with clients based on bankers’ activities
  • Collaborated with developers, project managers, business analysts, and business users in conceptualizing and developing data marts and enhancements. Deployed code to testing and pre-prod regions while maintaining different code versions in ClearCase
  • Implemented new logic to deliver new files incorporating the liquidity premium index. Carried out requirement and feasibility analysis, design, documentation, development, performance testing, and production implementation. Designed and developed database structures and logic; deployed functional packages
  • Worked on a module that generates CREATE TABLE statements and WHERE clauses dynamically; its result set is used to join the fact table with dimension tables. CREATE TABLE statements are generated dynamically from the data point id and report mapping, and WHERE clauses are assembled dynamically as well. Proposed a recursive function to handle huge data volumes by dynamically generating the SQL used to join fact and dimension tables; a sketch of the idea follows this list
  • Dexterously managed the development, execution, updating and reporting of project plans and schedules
  • Coordinated significantly in various technical aspects of the projects, i.e. requirement analysis, proposal, design and development, quality, and defects monitoring
  • Prepared and reviewed test plans, test cases, and test reports while testing the processing for defects
  • Meticulously documented all major activities for effective reference and use
  • Keenly participated in preparing approach document for new projects
  • Assured both quality and customer service while managing advanced/complex development tasks and projects to successful completion
  • Coordinated successfully with clients on test environment setup and data capture, and with user groups for input, feedback, and acceptance of the renovation
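A minimal Scala sketch of the dynamic join-SQL idea, assuming a hypothetical DimMapping case class and illustrative table, column, and predicate names; the real module derived these from Teradata metadata and report-mapping tables:

```scala
object DynamicJoinSql {

  // Hypothetical mapping: which dimension table joins to which fact foreign key
  final case class DimMapping(dimTable: String, factFk: String, dimKey: String)

  /** Assemble a SELECT that joins the fact table to each mapped dimension and
    * applies dynamically built WHERE predicates (e.g. per data-point id). */
  def buildJoinSql(factTable: String,
                   dims: Seq[DimMapping],
                   predicates: Seq[String]): String = {
    val joins = dims.map { d =>
      s"JOIN ${d.dimTable} ON $factTable.${d.factFk} = ${d.dimTable}.${d.dimKey}"
    }.mkString("\n")

    val where =
      if (predicates.isEmpty) ""
      else predicates.mkString("WHERE ", "\n  AND ", "")

    s"""SELECT *
       |FROM $factTable
       |$joins
       |$where""".stripMargin
  }

  def main(args: Array[String]): Unit = {
    // Illustrative inputs only; table/column names are not from the original project
    val sql = buildJoinSql(
      factTable  = "sales_fact",
      dims       = Seq(DimMapping("date_dim", "date_sk", "date_sk"),
                       DimMapping("product_dim", "product_sk", "product_sk")),
      predicates = Seq("date_dim.fiscal_year = 2015", "sales_fact.data_point_id = 42"))
    println(sql)
  }
}
```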
