AWS Solution Architect/Data Architect/Data Engineer/Data Analyst/ETL Lead Resume

Detroit, MI

SUMMARY

  • More than 12 years of industry experience delivering end-to-end solutions: designing, developing, leading, testing, managing and delivering assignments across the Automotive and Telecommunication sectors and other verticals. Building business rapport and serving as a channel for understanding, prioritizing, developing and delivering business requirements has been a strength. Proven track record of successful lead assignments, domestic and overseas, driving teams toward a common goal and outstanding deliverables.
  • Business coordination, requirements gathering, and development of Cloud, Big Data, Data Warehouse and Business Intelligence applications.
  • Experience in leading and managing teams. Handled multiple roles: Lead, SME, Consultant, Developer, Data Specialist and Data Steward.
  • Subject Matter Expert (SME) for several applications across multiple portfolios.
  • Proactive in chasing deliverables under tight deadlines amid challenging complexities in an Agile/Scrum environment.
  • Thorough process knowledge of Incident, Problem and Change management.
  • Transform complexities into manageable tasks in a challenging offshore-onsite workflow model, culminating in a smooth delivery team.
  • Enable each team member to realize their fullest potential by helping them face challenging tasks; mentor, guide and motivate them to succeed.
  • Cross-train team members to work on groups of projects in parallel for maximum efficiency.
  • Continuous business communication resulting in maximum transparency in work prioritization and delivery.
  • Mentor the offshore team to new heights of business satisfaction, so it can communicate, support and deliver to the business with minimal help from the lead team members.
  • Proactive in recognizing new business opportunities across cross-functional areas and teams to improve company profitability.
  • Support other application management teams when needed.

TECHNICAL SKILLS

OS Platforms: Linux, Windows, Sun Solaris

Cloud: Informatica Cloud Services, Salesforce

AWS: EC2, VPC, RDS, S3, IAM, CloudFront, CloudWatch, R53, SNS, SQS, ELB, Lambda, CLI, ECS

DevOps: Git, Docker, Kubernetes, JIRA

Database: Oracle, MySQL, DB2, Teradata, Hive, Impala, DynamoDB, SAP Hana

ETL: ICRT, IICS, Informatica Big Data Edition/PowerCenter/IDQ

Reporting Tool: Business Objects, Tableau

Data Modeling Tool: ERWIN

Scheduling Tools: IBM Tivoli Workload Manager, Autosys, Control-M

Framework: Cloudera HDFS

Big Data Tools: Hive, Impala, Spark, Kafka

Language: Shell, Python, Perl, Core Java, Oracle PL/SQL

PROFESSIONAL EXPERIENCE

Confidential, Detroit, MI

AWS Solution Architect/Data Architect/Data Engineer/Data Analyst/ETL Lead

Responsibilities:

  • Designed and implemented public- and private-facing websites on the AWS Cloud.
  • Migrated on-premise infrastructure to the AWS Cloud.
  • Designed for high availability and business continuity using self-healing architectures, failover routing policies, multi-AZ deployment of EC2 instances, ELB health checks, Auto Scaling and other disaster recovery models.
  • Configured and managed various AWS services including EC2, RDS, VPC, S3, CloudWatch, CloudFront and Route 53.
  • Reduced build and deployment times by designing and implementing a Docker workflow. Built and maintained Docker container clusters managed by Kubernetes; utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test and deploy.
  • Created pipelines for deploying code from GitHub to a Kubernetes (K8s) cluster as Docker containers using the Spinnaker platform.
  • Used Bash and Python, including Boto3, to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes, backing up AMIs and scheduling Lambda functions for routine AWS tasks (see the Boto3 sketch after this list).
  • Configured various performance metrics using AWS CloudWatch and CloudTrail.
  • Wrote various Lambda services in Python for automating functionality in the Cloud.
  • Used AWS Route 53 to configure high availability and disaster recovery, keeping the environment up and running in the event of an unexpected disaster.
  • Worked on S3 data store formats and used Athena to read S3 objects (see the Athena sketch after this list).
  • Part of the migration of our on-prem data warehouse to AWS Redshift using the AWS Database Migration Service and Schema Conversion Tool.
  • Maintained user accounts (IAM) and the RDS, Route 53, VPC, DynamoDB, SES, SQS and SNS services in the AWS cloud.
  • Created AMIs of mission-critical production servers for backup.
  • Worked on infrastructure with Docker containerization using Kubernetes orchestration, as well as AWS Step Functions to coordinate multiple AWS services.
  • Deployed and configured Git repositories with branching, forks, tagging, and notifications.
  • Leveraged a CloudWatch alarms baseline and centralized logging with the AWS Landing Zone.
  • Integrated Informatica Big Data with Kafka topic- and partition-based point-to-point event communication.
  • Developed builds using Ant and Maven and used Jenkins to kick off builds and promote them across non-prod environments.
  • Developed baseline AWS account security; implemented and integrated endpoint protection, vulnerability scanning and intelligent threat detection.
  • Set up AWS Single Sign-On (SSO) for on-premise Active Directory (AD).
  • Implemented security best practices in AWS, including multi-factor authentication, access key rotation, encryption using KMS, firewalls (security groups and NACLs), S3 bucket policies and ACLs, mitigating DDoS attacks, etc.
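
Below is a minimal sketch of the kind of Lambda automation described above: a Python handler using Boto3 that creates AMI backups of tagged production instances. The `Backup` tag key and the name prefix are illustrative assumptions, not details taken from the actual project.

```python
import datetime

import boto3

ec2 = boto3.client("ec2")

# Hypothetical tag convention; the real project would use its own.
BACKUP_FILTER = {"Name": "tag:Backup", "Values": ["true"]}


def lambda_handler(event, context):
    """Scheduled Lambda: create AMIs of tagged production instances."""
    timestamp = datetime.datetime.utcnow().strftime("%Y%m%d-%H%M%S")
    reservations = ec2.describe_instances(Filters=[BACKUP_FILTER])["Reservations"]
    for reservation in reservations:
        for instance in reservation["Instances"]:
            instance_id = instance["InstanceId"]
            # NoReboot=True avoids downtime on mission-critical servers.
            image = ec2.create_image(
                InstanceId=instance_id,
                Name=f"backup-{instance_id}-{timestamp}",
                NoReboot=True,
            )
            print(f"Created AMI {image['ImageId']} for {instance_id}")
```

In the same spirit, reading S3 objects through Athena typically reduces to a `start_query_execution` call against an S3-backed table; the database, table and output-bucket names below are placeholders.

```python
import boto3

athena = boto3.client("athena")

# Placeholder names; substitute the real Glue database and results bucket.
response = athena.start_query_execution(
    QueryString="SELECT * FROM events LIMIT 10",
    QueryExecutionContext={"Database": "datalake"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/queries/"},
)
print(response["QueryExecutionId"])
```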

Confidential, Detroit, MI

Tech Lead

Responsibilities:

  • The project integrates Salesforce Cloud with a Big Data Datalake for CRM customer and marketing data.
  • The project follows an Agile methodology.
  • Lead business meetings to gather requirements and coordinate with the Business Analyst and Data Architect.
  • Translate business requirements into technical designs for the development team by providing data workflow models and design documents.
  • Estimate development efforts and assign tasks to team members.
  • Coordinate with offshore and onsite development teams through regular status meetings to track the project timeline in order to meet deadlines.
  • Conduct design and code reviews and knowledge transition sessions with the peer production support team prior to go-live.
  • Actively involved in project quality assurance testing phases.
  • Provide weekly project progress updates to business and technical managers.
  • Mentor the development team and resolve technical issues.
  • Proficient in Cloud, Big Data, Data Warehouse, and Data Analytics/BI tools.
  • Involved in project life cycles from translating business requirements into technical requirements, creating technical design documents and test plans for operational and data warehouse objects, through code deployments, maintenance and support activities.
  • Involved in product life cycles from POC, installation and testing through admin activities.
  • Proficient in Big Data tools such as the Hadoop File System (HDFS), Hive and Impala.
  • Worked on Cloud environments like AWS, Informatica Real Time and Salesforce.
  • Responsible for the requirement analysis, effort estimation, project planning, task allocation.
  • Involved in offshore team coordination for development activities and validating them for timeline delivery.
  • Provide training/mentoring to new entrants on support tasks.
  • Member of Standard Forum Committee for application development, quality and support process.

Confidential, Detroit, MI

Data Engineer/ETL Lead

Responsibilities:

  • This project was implemented using an Agile methodology.
  • Worked on Datalake integration with Salesforce using Informatica Cloud Services.
  • Responsible for the design and architecture of the Datalake on the Big Data Hadoop platform.
  • Worked with SQL-on-Hadoop engines such as Hive and Impala to query structured and unstructured data (see the sketch after this list).
  • Used Informatica Big Data ETL to develop code for data extraction, integration and loading into the Datalake.
  • Worked on the data quality process using the Informatica Data Quality (IDQ) product to cleanse and standardize customer profiles.
  • Served in a Data Stewardship role.
  • Responsible for the Informatica Big Data and Hadoop environment setup for the Datalake.
  • Conducted various POCs with vendors during the initial phase of the Big Data Datalake.
  • Worked on various long- and short-term projects as an ETL developer.
  • Member of the on-call and maintenance team.
  • Worked on ETL code deployments during project releases.
  • Provided support for existing applications and investigated/troubleshot production issues.
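
As a hedged illustration of the Hive/Impala querying mentioned above, the sketch below uses the PyHive client against a HiveServer2 endpoint. The host, username and table names are assumptions for illustration, not project specifics.

```python
from pyhive import hive  # pip install "pyhive[hive]"

# Hypothetical HiveServer2 endpoint and table; adjust to the real Datalake.
conn = hive.Connection(
    host="hive-gateway.example.com", port=10000, username="etl_user"
)
cursor = conn.cursor()

# Simple profile lookup against a Datalake table.
cursor.execute("SELECT customer_id, email FROM crm.customer_profiles LIMIT 10")
for customer_id, email in cursor.fetchall():
    print(customer_id, email)

cursor.close()
conn.close()
```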

Confidential, Dallas, TX

ETL Consultant

Responsibilities:

  • As an application developer and functional coordinator, supported different applications related to telecommunication products, previously handled by another vendor.
  • Involved in application transition and transformation phases from the previous vendor.
  • Analyze issues and propose solutions.
  • Schedule and monitor ETL jobs using the Informatica Scheduler.
  • Execute test setups to check that requirements are met.
  • Prepare unit test result documents.
  • Facilitating issue resolution and risk identification in different development and enhancement activities in the team.
  • Production Support, Major enhancements and maintenance of existing Data warehouse.
  • Responsible for translating business requirements to technical requirements and creating technical design documents and test plans for the Data warehouse objects.

Confidential

Informatica Developer

Responsibilities:

  • As an ETL developer, responsible for the development, testing and enhancement of existing data warehousing applications.
  • Served as the focal point for the entire system enhancement team for all issues related to event processing modules (stage area, operational data store, foundation data mart and reporting data marts).
  • Deliver new, complex, high-quality solutions to clients in response to varying business requirements.
  • Analyze issues and propose solutions.
  • Schedule and monitor ETL jobs using the Informatica Scheduler.
  • Execute test setups to check that requirements are met.
  • Prepare unit test result documents.
  • Perform string/volume testing as required.
  • Production support, major enhancements and maintenance of the existing data warehouse.
