ETL/Big Data/AWS Cloud Lead, Data Analyst Resume
SUMMARY
- Senior Technical Leader with over 16 years of experience in Information Technology, a team player committed to excelling in every opportunity, with a long track record across varied IT roles, strong analytical skills, and the ability to communicate confidently at all levels.
TECHNICAL SKILLS
AWS: EC2, S3, VPC, RDS, SNS, SQS, Redshift, CloudWatch, Lambda
Data Warehouse: Teradata (On-Prem), AWS Redshift, Snowflake (Cloud)
ETL/DWH Tools: Ab Initio (Co>Op 2.14, 3.1, 3.2), Informatica PowerCenter 8.6.1
Scripting Languages: UNIX Shell Scripting, Python
Databases: Oracle 9i, Oracle 10g
NoSQL Database: DynamoDB
OS Familiarity: UNIX, Windows
Scheduling Tools: AROW Cloud Scheduler, Control-M, Tidal, IBM Tivoli Maestro
PROFESSIONAL EXPERIENCE
ETL/Big Data/AWS Cloud Lead, Data Analyst
Confidential
Responsibilities:
- Extensive experience working with the Ab Initio ETL tool, with good exposure to generic graphs as well as continuous flows and micro graphs.
- Strong data warehousing concepts.
- Extensive knowledge of Star and Snowflake schemas.
- Extensive experience using scheduling tools such as Control-M.
- Expertise in SQL, UNIX shell scripting, and Python.
- Experience working with EC2, EMR, Lambda, and S3 buckets in AWS, provisioned using CloudFormation templates (see the sketch after this list).
- Hands-on experience creating continuous and batch graphs, identifying and resolving performance bottlenecks at various levels, and tuning graphs based on their performance.
- Demonstrated ability to complete multiple assignments simultaneously while maintaining the high standards of client organizations.
- Ability to cope well in a high-intensity work environment with changing priorities.
- Excellent verbal/written communication and interpersonal skills.
- A self-starter and team player who has worked in fast-paced development environments and stays committed to deliverables.
- Excellent Analytical and Problem-solving skills.
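A minimal sketch of the CloudFormation-driven provisioning referenced in the AWS bullet above, assuming a hypothetical template file, stack name, and parameters; boto3 creates the stack and waits for it to finish.

```python
import boto3

# Hypothetical stack/template names used for illustration only.
STACK_NAME = "etl-ingest-stack"
TEMPLATE_PATH = "etl_stack.yaml"

cfn = boto3.client("cloudformation", region_name="us-east-1")

with open(TEMPLATE_PATH) as f:
    template_body = f.read()

# Create the stack that provisions the EC2/EMR/Lambda/S3 resources
# declared in the template, then block until creation finishes.
cfn.create_stack(
    StackName=STACK_NAME,
    TemplateBody=template_body,
    Parameters=[{"ParameterKey": "Environment", "ParameterValue": "dev"}],
    Capabilities=["CAPABILITY_NAMED_IAM"],
)
cfn.get_waiter("stack_create_complete").wait(StackName=STACK_NAME)
```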
Confidential
Senior Data Analyst / Senior Data Engineer
Responsibilities:
- Design and maintain well-structured relational database schemas. Create automated ETL pipelines for a variety of raw manufacturing data sources. Research and develop new data storage architectures for growing data volumes.
- Design a resilient framework on Amazon Web Services that reads streaming data from the producer/source through Apache Kafka and ingests it into the client-specific cloud storage location (S3). An in-house data validation tool is integrated to handle data quality checks. Set up notifications through Amazon SNS (Simple Notification Service), which in turn trigger an AWS Lambda function to move the data from the landing S3 bucket to the centralized S3 location (see the Lambda sketch after this list).
- Use a data load framework to load the data from S3 into the Snowflake cloud data warehouse (see the COPY INTO sketch after this list). Ensure appropriate services and an effective design in a cost-efficient manner.
- Take in business managers'/product owners' requests on their strategies and future vision and turn those business requirements into data visualization products using the Tableau business intelligence data visualization tool.
- Integrate Snowflake SQL within Python code and set up the Apache Airflow scheduler to automate business reports on a regular basis (daily/weekly/monthly) and push the output files to Amazon S3 storage (see the Airflow sketch after this list).
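A minimal sketch of the SNS-triggered Lambda step described above, assuming the SNS message wraps a standard S3 event notification and using a hypothetical centralized bucket name; the handler copies each newly landed object into the centralized bucket.

```python
import json

import boto3

s3 = boto3.client("s3")

# Hypothetical centralized bucket name, for illustration only.
CENTRAL_BUCKET = "centralized-data-lake"

def lambda_handler(event, context):
    """Triggered by SNS; copies newly landed objects to the centralized S3 bucket."""
    for record in event["Records"]:
        # The SNS message body carries the original S3 event notification as JSON.
        s3_event = json.loads(record["Sns"]["Message"])
        for s3_record in s3_event.get("Records", []):
            source_bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            s3.copy_object(
                Bucket=CENTRAL_BUCKET,
                Key=key,
                CopySource={"Bucket": source_bucket, "Key": key},
            )
```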
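A sketch of the S3-to-Snowflake load step, assuming an external stage already defined over the centralized S3 location and placeholder connection details and table names; the load is issued as a COPY INTO through the Snowflake Python connector.

```python
import snowflake.connector

# Connection details, stage, and table names are placeholders for illustration.
conn = snowflake.connector.connect(
    account="xy12345",
    user="ETL_USER",
    password="********",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # COPY INTO reads the staged files from the centralized S3 location
    # via an external stage assumed to exist for this sketch.
    cur.execute("""
        COPY INTO RAW.MANUFACTURING_EVENTS
        FROM @CENTRAL_S3_STAGE/manufacturing/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)
finally:
    conn.close()
```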
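A sketch of the Airflow automation: a daily DAG runs Snowflake SQL from Python, writes the result to a CSV file, and pushes it to S3. The credentials, query, bucket, and object key are placeholders; weekly/monthly reports would use their own schedules.

```python
import csv
from datetime import datetime

import boto3
import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator

# Bucket, key, query, and connection details below are placeholders for illustration.
REPORT_BUCKET = "business-reports-bucket"

def build_daily_report(**_):
    conn = snowflake.connector.connect(
        account="xy12345", user="REPORT_USER", password="********",
        warehouse="REPORT_WH", database="ANALYTICS", schema="MART",
    )
    try:
        # Run the report query in Snowflake (placeholder SQL).
        rows = conn.cursor().execute(
            "SELECT REGION, SUM(SALES_AMT) FROM MART.DAILY_SALES GROUP BY REGION"
        ).fetchall()
    finally:
        conn.close()

    # Write the result set locally, then push the file to S3.
    path = "/tmp/daily_sales_report.csv"
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
    boto3.client("s3").upload_file(path, REPORT_BUCKET, "reports/daily_sales_report.csv")

with DAG(
    dag_id="daily_business_report",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="build_daily_report", python_callable=build_daily_report)
```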
Confidential
Cloud Lead/ETL Lead
Responsibilities:
- Analyzing the data lineage of the CARD-related sources, processes, and targets, and working with the solutions architect to lay out a migration plan for a smooth transition from on-prem to cloud.
- Performing gap analysis on the data lineage results and proposing/designing a system to address the gaps and support timely migration activities.
- Designing the pre-production plan and framework to develop and integrate our cloud system.
- Designing a parallel production environment to perform reconciliation between on-prem and cloud data outputs (see the reconciliation sketch after this list); mitigating the issues encountered and implementing it successfully without major hurdles.
- Focusing on building automations, frameworks, utilities, and quality software to establish a factory model for migration to the cloud.
- Working with managers to provide feedback on team members for performance reviews and career development purposes.
- Creating technical designs and building POCs on the on-prem exit process for new efforts, validating that a wild idea works before committing to it.
- Working with a distributed team of engineers from across the globe.
- Acting as the Cloud point of contact for other teams for any help required on post-production implementation.
- Staying hands-on with the codebase; reviewing work done in the team and providing constructive feedback; helping the team define coding practices and standards.
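A minimal sketch of the on-prem vs. cloud reconciliation run in the parallel environment, assuming Teradata on-prem and Snowflake in the cloud (both appear in the skills list) and placeholder credentials and table names; it compares row counts per table and flags mismatches.

```python
import snowflake.connector
import teradatasql

# Table list and connection details are placeholders for illustration.
TABLES = ["CARD_TXN", "CARD_ACCT", "CARD_DISPUTE"]

def row_count(cursor, table):
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

td_conn = teradatasql.connect(host="onprem-teradata", user="recon", password="********")
sf_conn = snowflake.connector.connect(
    account="xy12345", user="RECON_USER", password="********",
    warehouse="RECON_WH", database="CARD", schema="PUBLIC",
)
try:
    td_cur, sf_cur = td_conn.cursor(), sf_conn.cursor()
    for table in TABLES:
        # Compare the on-prem and cloud row counts for each migrated table.
        onprem, cloud = row_count(td_cur, table), row_count(sf_cur, table)
        status = "MATCH" if onprem == cloud else "MISMATCH"
        print(f"{table}: on-prem={onprem} cloud={cloud} -> {status}")
finally:
    td_conn.close()
    sf_conn.close()
```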