Sr. ETL Developer Resume

SUMMARY

  • Expert software professional with a background spanning many areas of information technology.
  • Extensive experience ranging from software design, development, and integration to work across a wide range of applications and technologies.
  • More than a decade of hands-on experience in computer science and operations.
  • Developed communication, cooperation, and leadership skills over an IT support career.
  • Knowledge of creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in Teradata.
  • Extensive familiarity with SQL queries and stored procedures; ready to apply years of IT experience to excel in any position.
  • Always willing to go the extra mile to ensure high-quality work; fully invested in every project, with a personal sense of pride and ownership.
  • Strong knowledge of designing and implementing data management, monitoring, security, and privacy using the full stack of Azure data services to meet business needs.
  • Stays abreast of current technological developments, learning new languages and maintaining a grasp of DBA functions including capacity planning, installation, configuration, database design, migration, performance monitoring, security, troubleshooting, and backup and data recovery.
  • Flexible and always open to discussion and suggestions from teammates; welcomes new information and ideas and prioritizes based on data.
  • Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad; hands-on experience with query tools such as TOAD, SQL Developer, PL/SQL Developer, and Teradata SQL Assistant.

TECHNICAL SKILLS

Databases: Teradata R14, R13, R12, V2R6, V2R5, Greenplum, Oracle, MS SQL Server, SQL Server 2000

Programming Languages: Unix Shell Scripting (KSH), SQL (SQL DWH, SQL DB, SQL Server), Python

Teradata Tools and Utilities: FastLoad, FastExport, MultiLoad, TPump, TPT, Teradata SQL Assistant, BTEQ

Cloud Technologies: Azure Data Factory, Azure Blob Storage, Azure Data Pipelines, Amazon S3, Amazon IAM

Operating Systems: Windows and Linux

Versioning Tools: Tortoise SVN, GitHub

Tools: Tivoli Workload Scheduler, AutoSys Scheduler, Control-M

ETL & BI Tools: Ab Initio, Informatica PowerCenter, MicroStrategy, Business Objects, Mainframes, Tableau

PROFESSIONAL EXPERIENCE

Sr. ETL Developer/Azure Data Engineer

Confidential

Responsibilities:

  • Serving as Sr. ETL Developer/Azure Data Engineer on a project to migrate Teradata and DataStage jobs to the Azure cloud, covering the Casino Management System, Point of Sale in Casinos, Hotel Management, and Slot Daily systems
  • Led a team to the successful implementation of the new project, which benefits casino and hotel management establishments
  • Successfully designed and developed data ingestion pipelines into Azure Data Lake, orchestrating the data pipelines to store all data in the lake as a single source of truth
  • Independently converted BTEQ scripts to stored procedures in SQL DWH, and designed and developed ETL frameworks using Azure Data Factory (a minimal illustration appears after this list)
  • Enhanced the functionality and reliability of the overall data infrastructure by recreating existing application logic and functionality in Azure Data Flows and the SQL Database environment
  • Transformed raw data from relational, non-relational, and other storage systems and integrated it into data-driven workflows to help map strategies, attain goals, and drive business value from the available data
  • Execute unit tests and validate expected results, iterating until all test conditions pass
  • Perform problem assessment, resolution, and documentation for existing ETL packages, mappings, and workflows in production
  • Provide support for all production issues for the migration project
  • Assess opportunities for application and process improvement and prepare documentation of rationale to share with team members
  • Adhere to high-quality development principles while delivering solutions on time
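
A minimal, hedged illustration of the BTEQ-to-stored-procedure conversion mentioned above; the schema, table, procedure, and column names are hypothetical stand-ins for the project's actual casino data model.

    -- Hypothetical example only: BTEQ INSERT/SELECT logic recreated as a
    -- stored procedure in Azure SQL DWH.
    CREATE PROCEDURE etl.usp_load_slot_daily
    AS
    BEGIN
        -- Aggregate the staged slot activity and load it into the target
        -- table, replacing the step the original BTEQ script performed.
        INSERT INTO dw.slot_daily (slot_id, play_date, coin_in, coin_out)
        SELECT s.slot_id,
               s.play_date,
               SUM(s.coin_in),
               SUM(s.coin_out)
        FROM   stg.slot_daily AS s
        GROUP BY s.slot_id, s.play_date;
    END;

The error handling and logging that BTEQ .IF ERRORCODE checks provided would typically move into the Azure Data Factory pipeline that calls such a procedure.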

ETL Developer/Program Analyst

Confidential

Responsibilities:

  • Part of a dynamic team on a project whose main objective was to design and create a risk-free data warehouse from new call traffic data, integrating data from different sources and making it available in the EDW for downstream reporting
  • Taught, motivated, and mentored others; translated the ideas of business leaders and end users into technical requirements, leading to better understanding and implementation
  • Successfully created Teradata MLoad, BTEQ, and FastExport scripts, driven extensively by Unix shell scripting, to load, transform, and extract data (a sketch of such a script appears after the environment line below)
  • Involved in migrating code from Informatica 8.1 to 9.6, as well as conducting Informatica PowerCenter, Informatica PowerExchange, and Teradata training
  • Collaborated with other members of the team to get information about business requirements and end-user needs in designing and delivering correct, high-quality data
  • Designed, implemented, and continuously expanded data pipelines by performing extraction, transformation, and loading activities
  • Significantly improved and maintained existing processes, ensuring the data architecture remains scalable and maintainable
  • Acknowledged by Senior Managers and Business clients for successfully executing phase 1 of the project
  • Participated in the development of process improvement techniques across development functions, as the team embarked on attaining CMMI level 1 credentials

Environment: Teradata, Teradata utilities (MLoad, BTEQ), Informatica, Linux shell scripts, AutoSys Scheduler
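
A minimal, hedged sketch of the kind of BTEQ script referenced above; in practice such scripts were invoked from ksh wrappers, and the database, table, and logon file names shown here are hypothetical.

    /* Logon details kept in a separate (hypothetical) file containing a .LOGON command */
    .RUN FILE = /home/etl/logon.btq;

    /* Aggregate staged call traffic and load it into the EDW target table */
    INSERT INTO edw.call_traffic_daily (call_date, trunk_id, call_count, total_minutes)
    SELECT call_date,
           trunk_id,
           COUNT(*),
           SUM(duration_sec) / 60.0
    FROM   stg.call_traffic
    GROUP BY call_date, trunk_id;

    /* Exit with a non-zero return code on failure so the calling
       shell wrapper and the scheduler can flag the job */
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;

The MLoad and FastExport scripts would follow the same pattern, with the ksh wrapper typically handling parameter substitution, logging, and return-code checks for the scheduler.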

Data Analyst

Confidential

Responsibilities:

  • Conducted analysis, developed information systems, and generated accurate and comprehensive reports on operational data
  • Collected, calculated, and checked information from various data sources to devise solutions, design systems, and implement information management processes
  • Cooperated and communicated well with other personnel across departments to explain and assist in the integration of information management and data communication systems
  • Recommended other methods and technology derived from the gathered data to maximize the efficiency of project implementation
