Technology Lead Resume
Alpharetta, GA
SUMMARY
- 10+ years' experience building enterprise data warehouse applications using Teradata, Informatica, and Erwin data modeling; full-stack Big Data Hadoop development and Google Cloud application development, building scalable, high-performance Big Data analytical systems with specialization in BigQuery and the Hadoop platform.
- Dynamic Data Analytics leader with a successful track record of building Data Warehouses, Business Intelligence & Analytic Solutions that empower companies to harness and monetize their data assets.
- Broad knowledge and perspective in data pipeline, data collection, data management, data engineering, reporting, analytics, and product/application development.
- Confident understanding of analytical and transactional databases with Teradata-EDW, Google Cloud & Hadoop systems.
- Effectively optimize the company's strategic data initiatives, creating actionable insights that help business users make better business decisions.
- Processing files in CSV, XML, and JSON as well as columnar and compressed formats such as ORC in Google Cloud and BigQuery.
- Creating requirements, helping prioritize work, and determining acceptance criteria for a development team, based on payments-industry knowledge and experience in the information technology space.
- Troubleshooting and identifying gaps in existing systems/processes, such as, but not limited to, manual processes that can easily be automated, the lack of appropriate tracking systems, and security risks and vulnerabilities.
- Assisting in evolving team process and driving continuous improvement based on experience in successful delivery in Agile teams.
- Managing the release process for products the team develops, based on company best practices and standards.
- Experience in analysis and design, performance tuning, query optimization, and building stored procedures, functions, packages, triggers, views, and indexes to implement database business logic in Teradata and SQL Server; loading data warehouse tables (dimension, fact, and aggregate tables) using SSIS and Teradata utilities.
- Experience in relational and dimensional data modeling, star schema modeling, and physical and logical data modeling with Erwin 7/7.3.
- Using data pipelines to extract, transform, and load data from OLTP relational databases into BigQuery for analysis.
- Construct full spectrum of data pipelines from raw data to consumption and analysis.
- Good experience converting business ideas into workable solutions by working closely with business teams to identify the right problems.
- Extensive hands-on experience writing complex stored procedures in SQL.
- Experience in Data mining, Visualization and Business Intelligence tools such as Tableau.
- Collaborated with clients to create business-driven reports; performed data mining and analysis using dynamic sets, filters, and groups in Tableau. Created dashboards and user-driven reports, thus improving the business.
- Experience using Teradata Administrator, Teradata Manager, Teradata PMON, and Teradata SQL Assistant, and writing Teradata load/export scripts (BTEQ, FastLoad, MultiLoad, TPump, TPT, and FastExport) in UNIX/Windows environments.
- Good experience in SQL, Teradata, Informatica, Oracle, SQL Server, shell scripting, Google Cloud, BigQuery, and Bigtable.
- Good Experience in implementing ETL logic in Informatica and Teradata.
- Developed Transformations, Mapping and Workflows using Informatica for processing Historical and Incremental loads.
- Developed ETL logic in BTEQ, TPT, and FastLoad for processing historical and incremental loads in Teradata.
- Good experience writing Unix shell scripts.
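The file-format work described above — for example, preparing a CSV extract as newline-delimited JSON, the format BigQuery load jobs accept for JSON source files — can be sketched with only the Python standard library (the column names here are hypothetical):

```python
import csv
import io
import json

def csv_to_ndjson(csv_text: str) -> str:
    """Convert CSV text to newline-delimited JSON, the shape
    expected by a BigQuery JSON load job (one object per line)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return "\n".join(json.dumps(row) for row in reader)

# Hypothetical columns for illustration:
sample = "id,name\n1,alice\n2,bob"
print(csv_to_ndjson(sample))
```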
TECHNICAL SKILLS
Cloud Technologies: Google Cloud - BigQuery, GCS, gsutil, bq load
On-Prem Hadoop Distributions: Cloudera, Hortonworks
Big Data Technologies: Hadoop - MapReduce, Sqoop, Hive, Hue, Oozie workflows
Databases: Oracle, Teradata, Microsoft SQL Server, MySQL
Programming Languages: Python, UNIX Shell Scripting, SQL
Database Utilities: BTEQ, FastLoad, TPT, MultiLoad, FastExport, TPump
ETL Tools: Informatica, Talend, Teradata Tools & Utilities, Teradata SQL Assistant
Reporting Tools: Tableau, BigQuery
Build Tools: Jenkins, Puppet
Operating Systems: Windows, Red Hat Linux, CentOS Linux
Other Tools: GitHub, Tidal, Control-M
PROFESSIONAL EXPERIENCE
Confidential, Alpharetta-GA
Technology Lead
Responsibilities:
- Analyze the existing Teradata data model and ensure there is no impact on reporting after migrating data to GCP
- Define the GCP data model so that extraction costs are optimal
- Help migrate existing data from the Teradata environment to Google Cloud Platform
- Make use of GCP tools such as BigQuery, Data Studio, Cloud Dataproc, Dataflow, and CI/CD pipelines
- Plan migration activities so that all subject areas are covered
- Technical consultant for the Data Management team: data analysis, root-cause analysis, and fixing data-related issues in the system for the business.
- Providing support in data mapping for BI / Analytics solution.
- Utilizing knowledge of functional area to link business problems and Data solutions.
- Converting business use cases into technology solutions.
- Create, evaluate, and modernize business and operational models, analyzing data using SQL in MySQL and BigQuery
- Attended meetings with team members to determine requirements and create user stories
- Research and analyze trends, forecasts, past sales, and market value with pivot tables and graphs in Excel
- Performing ETL process using SQL Server, Google Cloud Storage & BigQuery.
- Depending on the use case and platform, normalize and de-normalize data in SQL Server and BigQuery.
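The normalization choice above differs by platform: BigQuery often favors denormalized (wide or nested) records, while SQL Server favors normalized parent/child tables. A minimal, hypothetical sketch of both directions using plain Python dicts:

```python
def denormalize(orders):
    """Flatten parent records with nested line items into one row
    per item (the wide shape that suits BigQuery analytics)."""
    rows = []
    for order in orders:
        for item in order["items"]:
            rows.append({"order_id": order["order_id"],
                         "customer": order["customer"],
                         **item})
    return rows

def normalize(rows):
    """Rebuild parent/child structure from flat rows (the shape a
    normalized SQL Server schema would store in two tables)."""
    orders = {}
    for row in rows:
        order = orders.setdefault(row["order_id"],
                                  {"order_id": row["order_id"],
                                   "customer": row["customer"],
                                   "items": []})
        order["items"].append({"sku": row["sku"], "qty": row["qty"]})
    return list(orders.values())

# Hypothetical sample: one order with two line items.
sample = [{"order_id": 1, "customer": "acme",
           "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]}]
flat = denormalize(sample)
print(len(flat))  # 2 flat rows from 1 order with 2 items
```

The round trip `normalize(denormalize(x)) == x` is a cheap sanity check when validating such a migration.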
Confidential
IT Analyst
Responsibilities:
- Reverse-engineered a database and worked on the data aspects of a system conversion and migration.
- Performance tuning and optimizing the worst-performing queries in the Teradata production environment.
- Implementing ETL logic in BTEQ and TPT scripts.
- Transforming real time business use cases into business models in Teradata and Bigdata Hadoop.
- Contributing to the future roadmaps and planning based on industry experience and insights.
- Leading the change management and release process for products the team develops based on ITIL and company best practices.
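Reverse-engineering a database, as in the first responsibility above, typically starts from the catalog tables rather than documentation. As a stand-in for a Teradata or SQL Server catalog, a minimal sketch against SQLite's `sqlite_master` (the example tables are hypothetical):

```python
import sqlite3

def reverse_engineer(conn):
    """Map each user table to its column names, read from the
    database's own catalog (here, sqlite_master + PRAGMA)."""
    schema = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        schema[table] = [c[1] for c in cols]  # c[1] is the column name
    return schema

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
print(reverse_engineer(conn))
```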
Confidential
Teradata ETL Developer
Responsibilities:
- Prepared ETL scripts for data acquisition and transformation. Developed various mappings using transformations such as Source Qualifier, Joiner, Filter, Router, Expression, and Lookup in Informatica.
- Creating conceptual, logical, and physical database models for metadata tables, views, and related database structures using Erwin.
- Coding in Teradata BTEQ SQL, implementing ETL logic in Informatica, and transferring files using an SSH client.
- Populating and refreshing Teradata tables using FastLoad, MultiLoad, and FastExport utilities/scripts for user acceptance testing and for loading history data into Teradata.
- Creating and writing Unix shell scripts (Korn shell scripting - ksh).
- Preparing test cases and performing Unit Testing and integration testing.
- Performance tuning long-running queries; worked on complex queries to map data as per the requirements.
- Production Implementation and Post Production Support.
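Incremental loads like the ones above hinge on splitting source rows into inserts and updates against the target's keys — the upsert that a MultiLoad job performs. A minimal, hypothetical sketch of that split in Python (key and column names are for illustration only):

```python
def split_incremental(source_rows, target_keys, key="id"):
    """Split source rows into inserts (key absent from the target)
    and updates (key already present), as an upsert load would."""
    inserts, updates = [], []
    for row in source_rows:
        (updates if row[key] in target_keys else inserts).append(row)
    return inserts, updates

existing = {1, 2}                      # keys already in the target table
incoming = [{"id": 2, "amt": 10},      # matches key 2 -> update
            {"id": 3, "amt": 5}]       # new key 3 -> insert
ins, upd = split_incremental(incoming, existing)
print(len(ins), len(upd))  # 1 1
```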