
Sr. Cloud Engineer Resume


NYC

SUMMARY

  • Data and Analytics Engineer with 5 years of experience leading, designing, developing, and implementing large-scale data analytics projects with AWS, Salesforce, SQL, and Databricks.
  • Demonstrated ability to lead initiatives and align business and technology goals in diverse platforms and industries.
  • Proven track record of managing multiple priorities in critical, time-sensitive environments, with deep technical and business capability to develop and manage strategic technology solutions.
  • Experienced in Data Lake, Lakehouse, Data Warehouse, Business Intelligence, Big Data, and data integration technologies; experienced in data architecture design and ETL solution development.
  • Developed and customized Salesforce.com applications based on user needs; customized field and page layouts for standard objects such as Account, Contact, and Lead.
  • Designed cloud migration and ETL pipeline patterns for data lakes.
  • Hands-on experience with AWS services including S3, EC2, SNS, Lambda, Data Pipeline, Athena, AWS Glue, S3 Glacier, and CloudFormation.
  • Experienced in Data Migration & Data Engineering in Big Data Application Development.
  • Designed distributed algorithms for identifying trends in data and processing them effectively.
  • Created tables, table partitions, join conditions, correlated subqueries, nested queries, views, sequences, and synonyms for business application development.
  • Understood existing business processes and interacted with super users and end users to finalize requirements.
  • Designed and developed logical and physical data models and schemas; wrote PL/SQL code for data conversion in the Clearance Strategy project.
  • Developed database triggers, packages, functions, and stored procedures using PL/SQL and maintained the scripts for various data feeds.
  • Created indexes for faster retrieval of customer information and enhanced database performance.

TECHNICAL SKILLS

  • Salesforce
  • AWS
  • S3
  • IAM
  • Databricks
  • EC2
  • SQL
  • Lambda
  • CloudFormation
  • GitHub
  • AWS RDS
  • AWS Glue
  • AWS Athena
  • Delta Lake
  • Oracle
  • SQL Server
  • Python

PROFESSIONAL EXPERIENCE

Confidential, NYC

Sr. Cloud Engineer

Responsibilities:

  • Extensive experience in Amazon Web Services (AWS) Cloud services such as EC2 and S3.
  • Built S3 buckets, managed S3 bucket policies, and used S3 and Glacier for storage and backup on AWS.
  • Built a VPC and established a site-to-site VPN connection between the data center and AWS.
  • Managed and administered AWS services including EC2, VPC, S3, ELB, Glacier, Route 53, and IAM via the AWS CLI.
  • Enabled Amazon IAM to grant users permissions to resources; managed user roles and permissions with AWS IAM.
  • Configured AWS Multi-Factor Authentication in IAM to implement two-step authentication of user access using Google Authenticator and AWS Virtual MFA.
  • Configured security groups, network ACLs, internet gateways, and Elastic IPs to create a secure environment for the organization in the AWS public cloud.
  • Involved in designing and developing Amazon EC2, Amazon S3, Amazon RDS, Amazon Elastic Load balancer and other services of the AWS infrastructure.
  • Integrated Oozie logs into a Kibana dashboard; used DynamoDB to store data for metrics and backend reports.
  • Developed predictive analytics over S3, SNS, and SQS data using Apache Spark Scala APIs.
  • Experienced in AWS, implementing solutions using services such as EC2, S3, RDS, Redshift, and VPC.
  • Applied object-oriented programming, the Java Collections API, SOA, design patterns, multithreading, and network programming techniques; worked with S3, dbt (data build tool), SNS, and SQS.
  • Worked with Lambda and PostgreSQL, as well as Flume, Storm, and Spark.
  • Developed Databricks ETL pipelines using notebooks, Spark DataFrames, Spark SQL, and Python scripting.
  • Recreated existing application logic and functionality in the SQL Database and SQL Data Warehouse environments.
  • Created Databricks notebooks using SQL and Python, and automated notebooks using jobs.
  • Converted Hive/SQL queries into Spark transformations using Spark RDDs, Python, and Scala.
  • Developed Spark Streaming jobs by writing RDDs and building DataFrames using Spark SQL as needed.
  • Experience in Agile Development Methodology.
  • Supported the QA team by fixing defects; highly knowledgeable in issue-tracking tools such as JIRA.
  • Experience in code repositories such as Git.
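The S3 bucket-policy management described above can be sketched in Python. This is a minimal illustration, not the actual policies used on the job: the bucket name, account ID, and permission set are hypothetical, and the `boto3` call that would apply the policy is shown only as a comment.

```python
import json

def build_readonly_bucket_policy(bucket_name: str, account_id: str) -> dict:
    """Build an S3 bucket policy granting read-only access to one AWS account.

    All identifiers here are placeholders for illustration.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ReadOnlyForAccount",
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{account_id}:root"},
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",
                    f"arn:aws:s3:::{bucket_name}/*",
                ],
            }
        ],
    }

policy = build_readonly_bucket_policy("example-data-lake", "123456789012")
# With AWS credentials configured, boto3 could apply the policy:
# boto3.client("s3").put_bucket_policy(
#     Bucket="example-data-lake", PolicyDocument=json.dumps(policy))
print(json.dumps(policy, indent=2))
```

Building the policy as a plain dictionary keeps it testable and reviewable before anything touches AWS.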

Environment: AWS, Databricks, SQL, Java, JIRA, Python.

Confidential, Tampa, FL

Sr. Data Engineer

Responsibilities:

  • Created campaign-to-lead forms, assigned tasks, and managed workflows through Salesforce.
  • Gathered requirements from the data architect and business analyst.
  • Created code to perform various actions on data in S3.
  • Designed complex SQL queries.
  • Developed new email campaigns based on business needs.
  • Maintained existing email campaigns and modified them as requirements evolved.
  • Converted the functional specification into low level technical design document.
  • Wrote bash scripts for data pipelining.
  • Loaded data from Amazon S3 to RDS.
  • Worked with data in S3 to visualize data file creation.
  • Responsible for code review and development.
  • Responsible for unit testing.
  • Prepared integration test script for data validation in test environment.
  • Built and developed ETL pipelines.
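Code that "performs various actions on data in S3" before loading it to RDS typically starts with cleaning raw records. This is a hypothetical sketch of such a step: the field names and sample data are invented, and the CSV text stands in for an object fetched from S3.

```python
import csv
import io

def clean_rows(raw_csv: str) -> list:
    """Normalize raw CSV text (as it might be read from an S3 object)
    before loading it into RDS. Field names are illustrative only."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({
            # Trim whitespace and lower-case emails for consistent keys.
            "email": row["email"].strip().lower(),
            "campaign": row["campaign"].strip(),
        })
    return rows

sample = "email,campaign\n  USER@Example.COM ,Spring Promo\n"
cleaned = clean_rows(sample)
print(cleaned)  # [{'email': 'user@example.com', 'campaign': 'Spring Promo'}]
```

In a real pipeline the cleaned rows would be batch-inserted into RDS; keeping the transform as a pure function makes it easy to unit-test, which matches the unit-testing responsibility above.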

Environment: Python, AWS, SQL Server, Salesforce

Confidential, Lake Mary, FL

Data Engineer

Responsibilities:

  • Worked as implementation lead for this application in an offshore/onshore delivery model, which involved understanding the client's requirements and discussing them with the data modeler.
  • Converted the functional specification into a low-level technical design document.
  • Coordinated with the offshore team and assigned delivery tasks.
  • Provided technical help to the offshore team.
  • Developed the complex ETL mappings.
  • Defined job dependencies for scheduled runs.
  • Code review, data validation and testing.
  • Developed data conversion ETL processes.
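Defining which jobs must run before others, as in the dependency work above, amounts to a topological sort. A minimal sketch in Python using the standard library's `graphlib` (the job names and dependency map are hypothetical, not from the actual Informatica workflows):

```python
from graphlib import TopologicalSorter

# Hypothetical ETL job map: each job lists the jobs it depends on.
deps = {
    "load_staging": set(),
    "transform_dims": {"load_staging"},
    "transform_facts": {"load_staging"},
    "publish_marts": {"transform_dims", "transform_facts"},
}

# static_order() yields jobs so every job appears after its dependencies.
run_order = list(TopologicalSorter(deps).static_order())
print(run_order)
```

`graphlib` also raises `CycleError` if the dependency map is circular, which is a useful sanity check before scheduling.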

Environment: Informatica, Oracle, Workday, SQL

Confidential

Data Analyst

Responsibilities:

  • Wrote SQL queries to fetch required data from database tables as per requirements.
  • Used SAS for data pre-processing, extraction, validation, and manipulation.
  • Analyzed and ensured efficient transition of all technical design documents and developed various SQL packages to support application developers.
  • Responsible for reports scheduled on a daily, weekly, and monthly basis.
  • Automated reports and developed and maintained macros.
  • Used SAS/Base for creating, sorting, and updating reports.
  • Analyzed data team quality performance by applying Excel formulas and creating graphs/charts on weekly trackers and compliance reports to identify weak links and measure return on investment for the business.
  • Actively participated in and led weekly project status meetings.
  • Created effective meeting agendas to capture appropriate information, needs, and concerns.
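The weekly compliance tracking described above was done with SAS and Excel; the same aggregation can be sketched in plain Python for illustration. The tracker records and the pass/total fields below are invented, not real project data.

```python
from collections import defaultdict

# Hypothetical weekly tracker records: (week, checks_passed, checks_total).
tracker = [
    ("2023-W01", 45, 50),
    ("2023-W01", 38, 40),
    ("2023-W02", 40, 50),
]

passed = defaultdict(int)
total = defaultdict(int)
for week, ok, n in tracker:
    passed[week] += ok
    total[week] += n

# Compliance rate per week, rounded for reporting.
compliance = {week: round(passed[week] / total[week], 3) for week in total}
print(compliance)  # {'2023-W01': 0.922, '2023-W02': 0.8}
```

Summing passes and totals before dividing (rather than averaging per-record rates) weights each check equally, which is usually what a compliance report intends.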

Environment: SAS, SQL, Oracle, UNIX, Windows
