Senior Cloud Data Engineer / Database Administrator Resume

Malvern, PA

SUMMARY

  • 12+ years of IT industry experience in DB2 UDB, Oracle, PostgreSQL, and MySQL database administration and infrastructure services, with good knowledge of cloud computing, AWS services, and AWS deployment/migration
  • 2+ years of experience with AWS VPC, EC2, S3, EBS, RDS, ELB, SNS, AMI, IAM, EMR, Route 53, CloudWatch, and Lambda
  • Experience in AWS cloud computing platform, migrating databases to Amazon RDS
  • Hands-on experience with AWS DynamoDB, Kafka, AWS Kinesis, Elasticsearch, Data Pipeline, EMR, Redshift, machine learning, Spark, Hadoop, Hive, and Python
  • Designed, developed, and implemented Oracle database instances on AWS for development and production environments
  • Tested, validated, and implemented performance and resource optimization improvements in consultation with AWS development teams
  • Good knowledge of building private AMIs in VPCs
  • Proficient in High Availability and Disaster recovery implementations
  • Backup and recovery (RMAN and traditional), export and import using Data Pump
  • Hands-on experience with AWS solutions including S3, Kinesis, Lambda, EMR, DynamoDB, AWS Redshift, Spark, RDS
  • Solid understanding of ETL architectures, data movement technologies, database partitioning, database optimization, and building communication channels between structured and unstructured databases.
  • Knowledge of data management including data warehousing & statistical modeling.
  • A strong understanding of data profiling and data cleansing techniques
  • Hands-on experience working with large complex data sets, real-time/near real-time analytics, and distributed big data platforms
  • Good knowledge of NoSQL databases such as MongoDB, Cassandra, and HBase, as well as Hive
  • Automated manual tasks using shell scripting and Python
  • Good understanding of big data, R programming, and columnar databases
  • Experience in R programming, statistical modeling and analysis in R, and data analysis
  • Migrated around 500 databases from old to newer versions (v9.1 and v9.5 to v9.7, 10.1, and 10.5) and applied Fix Packs to keep DB2 at the most recent level
  • Proficient in ITIL methodologies, support processes, and SLA guidelines

TECHNICAL SKILLS

Technology: DB2 LUW 9.x/10.x, DB2 BLU, Cognos, SQL Server, NoSQL, TSM; Cloud: AWS RDS, EC2, S3, IAM, EBS, ELB, VPC, Route 53, CloudWatch, Lambda, big data, data lakes, Data Pipeline, AWS Redshift, Kinesis, DynamoDB, EMR, data architecture, Hadoop, Python, Hive, SQL, Sqoop, ETL, data modeling, database partitioning, data profiling, replication, CloudFormation, AMI, Elastic Beanstalk, high availability, on-premises-to-cloud database migration

Database Tools: IBM Guardium, IBM OPTIM, Data Studio, DB2 Connect

Operating System: AIX, Linux, Windows, Solaris, UNIX

ITIL Tools: ServiceNow, Maximo, Remedy

Monitoring Tools: Tivoli Monitoring, Netcool, CloudWatch

PROFESSIONAL EXPERIENCE

Confidential, Malvern, PA

Senior Cloud Data Engineer / Database Administrator

Responsibilities:

  • Designed CloudFormation templates for RDS (MySQL, PostgreSQL, SQL Server) databases and spun up multiple databases for multiple SI app teams for microservices and client applications
  • Designed a strategy to implement Lambda functions across environments and incorporated them within CloudFormation templates
  • Migrated databases from on-premises Oracle to cloud RDS MySQL/PostgreSQL
  • Implemented infrastructure as code: designed CloudFormation code and promoted it through various environments, so that RDS database instances are spun up with a single-click deployment approach
  • Maintained infrastructure code in Bitbucket and GitHub repositories and promoted it through various environments (TEST, UAT, PROD) via the Bamboo deployment process
  • Worked against JIRA stories: created a branch through JIRA on the Bitbucket repository, modified the Python code and CloudFormation template in local STS, pushed the changes to the remote branch, raised a pull request to merge the changes to master, obtained approvals, and deployed to test and production through Bamboo
  • Designed and implemented CloudWatch alarms and configured metrics for various RDS instances
  • Automated database audit reports covering RDS instances and LOB details
  • Implemented a cost optimization mechanism by designing Lambda functions to auto-shutdown RDS instances during non-business hours based on CloudFormation tag information
  • Designed and tested the data load process in RDS instances and monitored the data flow mechanism through the DataSunrise product
  • Set up a Lambda function through an SNS topic and configured a notification process for whenever an RDS instance is created, deleted, modified, or given a minor version upgrade
  • Designed self-service documents for application teams on using the CloudFormation templates so that they could spin up RDS instances themselves
  • Worked closely with application teams and helped design and provide various cloud solutions
  • Designed the strategy for, and successfully executed, database migrations from on premises to the AWS cloud
  • Loaded data into Redshift tables from S3, DynamoDB, EMR, and EC2 using the COPY utility
  • Leveraged Data Pipeline to move data in and out of Redshift tables to RDS through UNLOAD/COPY; configured Lambda to automatically copy files from S3 to Redshift tables and drop processed files from S3 buckets
  • Designed Redshift tables using the KEY distribution style and sort keys
  • Used WLM for workload management within the cluster
  • Ran maintenance utilities such as VACUUM and ANALYZE and improved the performance of RDS MySQL and Redshift cluster tables
  • Configured CloudWatch metrics using CloudFormation templates, routed alarms through SNS topics and event subscriptions to Tivoli Viewport/OMNIbus and Splunk, and integrated with ServiceNow to generate automated incidents
  • Designed CloudFormation templates to spin up Redshift and EMR clusters
  • Implemented cost-saving mechanisms to auto-shutdown non-prod RDS instances, EMR clusters, and Redshift clusters during non-business hours when there is no usage
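As an illustration of the tag-driven RDS shutdown described above, a minimal Lambda handler might look like the sketch below. The `AutoStop` tag name and the business-hours window are assumptions, while the boto3 calls (`describe_db_instances`, `list_tags_for_resource`, `stop_db_instance`) are standard RDS API methods.

```python
from datetime import datetime, timezone

# Hypothetical convention: instances tagged AutoStop=true are stopped
# outside an assumed business window of 08:00-20:00 UTC.
BUSINESS_START, BUSINESS_END = 8, 20

def should_stop(tags, now=None):
    """Return True if an RDS instance with these tags should be stopped now."""
    now = now or datetime.now(timezone.utc)
    tag_map = {t["Key"]: t["Value"] for t in tags}
    if tag_map.get("AutoStop", "").lower() != "true":
        return False
    return not (BUSINESS_START <= now.hour < BUSINESS_END)

def lambda_handler(event, context):
    """Invoked on a schedule (e.g. an EventBridge rule) to stop idle non-prod RDS instances."""
    import boto3  # imported lazily so the decision logic above is testable offline
    rds = boto3.client("rds")
    for db in rds.describe_db_instances()["DBInstances"]:
        if db["DBInstanceStatus"] != "available":
            continue
        tags = rds.list_tags_for_resource(ResourceName=db["DBInstanceArn"])["TagList"]
        if should_stop(tags):
            rds.stop_db_instance(DBInstanceIdentifier=db["DBInstanceIdentifier"])
```

In practice the same pattern extends to EMR and Redshift clusters by swapping in the corresponding service clients.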

Confidential, McLean, VA

Senior Cloud Data Engineer / Database Administrator

Responsibilities:

  • Worked on AWS services such as EC2, S3, RDS, IAM, EBS, ELB, and VPC
  • Involved in the design and database migration strategy for moving from on premises to the AWS private cloud
  • Designed, developed, and implemented Oracle database instances on AWS for development and production environments
  • Tested, validated, and implemented performance and resource optimization improvements in consultation with AWS development teams
  • Built private AMIs in VPCs
  • Migrated databases from on-premises Oracle to cloud RDS MySQL/PostgreSQL
  • Designed the strategy for, and successfully executed, database migrations from on premises to the AWS cloud
  • Loaded data into Redshift tables from S3, DynamoDB, EMR, and EC2 using the COPY utility
  • Leveraged Data Pipeline to move data in and out of Redshift tables to RDS through UNLOAD/COPY; configured Lambda to automatically copy files from S3 to Redshift tables and drop processed files from S3 buckets
  • Worked hands-on with a MongoDB cluster environment and the AWS Redshift data warehouse
  • Worked on a remediation project migrating DB2 databases from version 9.7 to 10.5
  • Migrated a DB2 corporate data warehouse DPF database of 45+ TB and 33 logical partitions from DB2 9.7 on AIX to 10.5 on Linux
  • Worked with HPU unload/load to move data from source to target
  • Worked on EMC advanced database backup and restore mechanisms using DD Boost
  • Designed and built new DPF DB2 10.5 database environments on Linux
  • Migrated Oracle databases from 11g to 12c
  • Performed Oracle database administration activities such as export and import using Data Pump
  • Worked on Oracle database backup and restore using RMAN
  • Worked with project teams to deploy their products into the AWS environment
  • Supported DB2 databases for Single Family and Multi Family applications
  • Performed performance and query tuning and improved application performance
  • Performed database administration activities such as backup, refresh, RUNSTATS, REORG, capacity planning, query tuning, database design and build, database monitoring, and troubleshooting and resolving issues
  • Tuned databases and improved overall database performance
  • Analyzed and troubleshot issues using db2diag.log files and DB2 tools such as db2pd, db2top, and snapshots
  • Provided 24/7 production environment support
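The Redshift COPY loads mentioned above follow a common pattern; the sketch below builds such a statement. The `sales_fact` table, the S3 prefix, and the IAM role are all placeholders, while `COPY`, `IAM_ROLE`, `FORMAT AS`, and `GZIP` are standard Redshift COPY options.

```python
def build_copy_statement(table, s3_path, iam_role, fmt="CSV"):
    """Build a Redshift COPY statement for loading a table from S3."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS {fmt} GZIP;"
    )

stmt = build_copy_statement(
    "sales_fact",                                   # hypothetical target table
    "s3://example-bucket/loads/sales/",             # hypothetical S3 prefix
    "arn:aws:iam::123456789012:role/RedshiftCopy",  # hypothetical IAM role
)
print(stmt)
```

A statement like this would typically be executed against the cluster from a Lambda function or a Data Pipeline activity whenever new files land under the S3 prefix.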

Confidential, Atlanta, GA

Senior Database Administrator

Responsibilities:

  • Performed DB2 database installation and migration activities
  • Worked with IBM ISAS data warehousing (DPF); managed an ISAS BI appliance database of 95 TB
  • Worked in a Database Partitioning Feature (DPF) environment and created range partitions and data partitions
  • Implemented the IBM PDOA system
  • Installed and configured Hadoop and worked on the Cloudera landscape
  • Worked with column-based and row-based architecture tables using DB2 BLU
  • Supported IBM DB2 databases for the Cognos application and assisted with Cognos reporting tools
  • Performed performance monitoring in DB2 and handled DB2 TSM backups
  • Created DDL statements for database objects (tables, indexes, UDFs, views, etc.)
  • Set up DB2 HADR to provide high availability (HA) and scalability
  • Worked closely with application developers, data modelers, engineering, security administrators, capacity planning and monitoring, service desk scheduling, and network administrators as needed
  • Performed performance tuning, oversaw backups, and created scripts for task automation
  • Troubleshot database issues using db2diag, db2pd, db2top, snapshots, and event monitors
  • Set up a High Availability Disaster Recovery environment for multiple databases
  • Interpreted DB2 administration and diagnostic logs
  • Provided on-call weekend and remote 24x7x365 support
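The HADR setup mentioned above boils down to a handful of db2 CLP commands; the sketch below generates them for one side of the pair. The database name, hosts, service ports, and instance are placeholders, and in practice the standby must first be restored from a backup of the primary before HADR is started.

```python
def hadr_setup_commands(db, local_host, remote_host, local_svc, remote_svc, remote_inst, role):
    """Return db2 CLP commands to configure and start HADR on one side of the pair.

    Arguments are placeholders; the cfg parameter names and START HADR
    syntax follow standard DB2 LUW usage.
    """
    return [
        f"db2 update db cfg for {db} using HADR_LOCAL_HOST {local_host}",
        f"db2 update db cfg for {db} using HADR_REMOTE_HOST {remote_host}",
        f"db2 update db cfg for {db} using HADR_LOCAL_SVC {local_svc}",
        f"db2 update db cfg for {db} using HADR_REMOTE_SVC {remote_svc}",
        f"db2 update db cfg for {db} using HADR_REMOTE_INST {remote_inst}",
        # Start the standby first, then the primary, in a real rollout
        f"db2 start hadr on db {db} as {role}",
    ]

for cmd in hadr_setup_commands("SAMPLE", "hostA", "hostB", "50001", "50002", "db2inst1", "primary"):
    print(cmd)
```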

Confidential

Lead Database Administrator

Responsibilities:

  • Migrated databases from older versions to the latest versions
  • Performed performance monitoring in DB2 and handled DB2 TSM backups
  • Handled DB2 databases on AIX with good knowledge of the AIX operating system
  • Performed redirected restores using TSM and FlashCopy backup methodologies
  • Implemented backup and recovery strategies
  • Administered, supported, and maintained all Informix database systems
  • Completed database and other IBM Informix software installs and upgrades
  • Used DB2 data movement utilities such as Export, Import, Load, and db2move
  • Used DB2 maintenance utilities such as REORGCHK, REORG, and RUNSTATS
  • Performed SQL query tuning using the db2expln and db2exfmt tools
  • Troubleshot database issues using db2diag, db2pd, db2top, snapshots, and event monitors
  • Set up a High Availability Disaster Recovery environment for multiple databases
  • Prepared project-related documents and knowledge articles
  • Tuned DB and DBM configuration parameters and improved performance
  • Provided 24/7 production environment support and participated in on-call rotations
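The RUNSTATS/REORGCHK maintenance above can be scripted; the sketch below generates the db2 CLP commands for a list of tables. The database and table names are placeholders, while the RUNSTATS and REORGCHK syntax follows standard DB2 LUW usage.

```python
def maintenance_commands(db, tables):
    """Generate db2 CLP commands for routine statistics refresh and reorg checks."""
    cmds = [f"db2 connect to {db}"]
    for t in tables:
        # Refresh optimizer statistics, including distribution stats and all indexes
        cmds.append(
            f"db2 runstats on table {t} with distribution and detailed indexes all"
        )
    # Flag user tables whose formulas indicate a REORG is needed
    cmds.append("db2 reorgchk current statistics on table user")
    cmds.append("db2 terminate")
    return cmds

for cmd in maintenance_commands("SAMPLE", ["SCHEMA1.ORDERS"]):  # hypothetical names
    print(cmd)
```

A script like this would typically be scheduled via cron and its output piped through the db2 CLP during a maintenance window.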
