Database Engineering Consultant Resume
Pleasanton, CA
SUMMARY:
- Experienced, detail-oriented Senior Database Engineer with experience in Big Data technologies. Designs, develops, and enhances existing and new data marts in OLTP or OLAP schemas. Highly skilled in Data Modeling, Reverse Engineering, Refactoring, Performance Tuning, NoSQL, Data Cleansing, Accuracy, Stitching, Conversion, Integration, Reconciliation, Data Quality, Metadata Management, and Data Privacy and Security.
- 10 years of experience with Data flows, Data Architecture (OLTP and Dimensional) and ETL
- Experienced in the full SDLC in Agile execution environments
- Familiarity working with data science and machine learning teams
- Functional understanding of enterprise data warehousing, business intelligence, and SQL
- Experience in ETL services (Sourcing, Data Modeling, ETL/Mapping/Transformations/Workflows, Infrastructure, BI, Reporting and dashboards)
- Ability to develop and code complex database processes and batch jobs in PL/SQL
- Strong coding skills in SQL and PL/SQL (packages, procedures, functions, triggers) and batch programming techniques such as bulk collections, dynamic SQL, and parallel processing
- Experience in gathering, prioritizing, and breaking down user stories and allocating them to technical resources using Excel, Jira, Rally, ServiceNow, etc.
- Experience in creating technical designs from business requirements, including data mappings, data dictionaries, and data structures usable from Java and other 3GL programming languages
- Experience developing and deploying applications to a public cloud like AWS.
- Uses Erwin, Visio, SQL Modeler, etc. to model data or to reverse engineer existing data
- Hands-on coding experience in Java, C, Python, Shell, SQL, PL/SQL, etc.
- Hands-on experience with ETL tools like Informatica
- Experience with Data Visualization and BI Reporting tools like Tableau, QlikView, Microstrategy, etc.
- Oracle (11g/12c) expert with good experience in SQL Server, PostgreSQL, Teradata, HBase, and MongoDB
- Experience with big data technologies: HDFS, Pig, Hive, Impala, Hue, Ambari, Oozie, etc.
- Experience building pipeline with Apache Kafka Producer and Consumer APIs
- Experienced in using database tools like TOAD, SQL Developer or others
- Experience in Performance Tuning and in managing SLAs for all data sets and processes.
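The bulk-processing skills listed above (bulk collections, dynamic SQL) can be illustrated with a minimal PL/SQL sketch; all table names here are hypothetical placeholders, not from any actual engagement:

```sql
-- Sketch: batch-move rows from a staging table to a target table
-- using BULK COLLECT with a LIMIT and a single FORALL bind per batch.
DECLARE
  TYPE t_rows IS TABLE OF stg_orders%ROWTYPE;
  l_rows t_rows;
  CURSOR c IS SELECT * FROM stg_orders;
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_rows LIMIT 1000;  -- fetch in batches of 1000
    EXIT WHEN l_rows.COUNT = 0;
    FORALL i IN 1 .. l_rows.COUNT                 -- one bulk bind per batch
      INSERT INTO orders VALUES l_rows(i);
    COMMIT;
  END LOOP;
  CLOSE c;
  -- Dynamic SQL for DDL that static PL/SQL cannot issue directly
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_orders';
END;
/
```

The LIMIT clause keeps memory bounded on large source tables, while FORALL avoids per-row context switches between PL/SQL and the SQL engine.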
TECHNICAL SKILLS:
- C/C++, Java, Python, SQL, Shell Scripting, PL/SQL, JavaScript, HTML5
- Oracle, SQL Server, DB2 UDB, PostgreSQL, Teradata, NoSQL databases, JIRA, Rally, Agile/SCRUM
- Enterprise-Scale Systems, Test Automation, Data Analytics, Business Intelligence
- ETL, Informatica, Hadoop, Sqoop, Flume, DWH, Oracle ERP, TOAD, SQL Developer
PROFESSIONAL EXPERIENCE:
Database Engineering Consultant
Confidential, Pleasanton, CA
- Leading a talented data engineering and analytics team for Ansit clients Albertsons and Lipman Produce.
- Led the onsite and offshore ETL team in developing ETL interfaces for converting data from one store system to another.
- Worked with Business in identifying requirements and delivering ETL solutions on time in collaboration with Project Management, ETL Team and Data Governance team.
- Reduced exception reporting time by creating QlikView dashboards and writing complex SQL for 100+ QlikView reports.
- Created an ETL workflow that runs the ETL script and then calls the Kafka Producer API to activate the data pipeline.
- Maximized ROI on transportation management by building a proprietary big data analytics platform on the Hadoop ecosystem for the trucking division.
- Made truck scheduling & tracking automatic and seamless by creating a mobile app & platform strategy.
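An exception report for a store-to-store data conversion of the kind described above typically reduces to a reconciliation query; this is an illustrative sketch only, with hypothetical table and column names:

```sql
-- Flag items that are missing after conversion, or whose converted
-- price has drifted beyond a small tolerance. Names are placeholders.
SELECT s.store_id,
       s.item_id,
       s.unit_price AS source_price,
       t.unit_price AS target_price
FROM   source_items s
LEFT JOIN target_items t
       ON  t.store_id = s.store_id
       AND t.item_id  = s.item_id
WHERE  t.item_id IS NULL                          -- dropped during conversion
   OR  ABS(s.unit_price - t.unit_price) > 0.005;  -- price mismatch
```

Queries of this shape can feed a dashboard directly, so exceptions surface as soon as a conversion batch lands rather than after manual review.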
Database Engineer
Confidential, Palo Alto, CA
- Supported data scientists by provisioning curated structured and unstructured data from various sources.
- Designed and implemented a real-time networking log analysis platform using Kafka, Spark, and Hive, with data stored in HDFS, to identify faults, events, capacity, configurations, and the security of infrastructure elements.
- Involved in developing the My Health application using Hadoop and MongoDB with Java
- Created data models for various healthcare applications
- Migrated many Oracle-based applications (written in PL/SQL) to PostgreSQL (PL/pgSQL) to save on licensing costs
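A typical unit of such a migration is translating an Oracle stored function into its PL/pgSQL equivalent. The function below is a hypothetical example, not from the actual applications:

```sql
-- Oracle original (sketch): age of a ticket in days
CREATE OR REPLACE FUNCTION days_open (p_id NUMBER) RETURN NUMBER IS
  l_opened DATE;
BEGIN
  SELECT opened_on INTO l_opened FROM tickets WHERE id = p_id;
  RETURN TRUNC(SYSDATE) - TRUNC(l_opened);
END;
/

-- PostgreSQL / PL/pgSQL equivalent
CREATE OR REPLACE FUNCTION days_open(p_id integer) RETURNS integer AS $$
DECLARE
  l_opened date;
BEGIN
  SELECT opened_on INTO l_opened FROM tickets WHERE id = p_id;
  RETURN current_date - l_opened;  -- date subtraction yields an integer
END;
$$ LANGUAGE plpgsql;
```

Most of the effort in practice is in the systematic differences: SYSDATE vs. current_date, NUMBER vs. integer/numeric, packages (which PostgreSQL lacks) refactored into schemas of standalone functions.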
Sr. Data Architect
Confidential, Foster City, CA
- Performed data modeling from scratch for new systems and reverse engineered existing data for Confidential loyalty programs (Rewards, Sweepstakes, etc.) and mobile payment platforms (Apple Pay).
- Attended meetings with business users to understand business requirements and worked with solutions architects to come up with optimal data models usable by Java/J2EE application developers, ETL developers, and MicroStrategy report developers.
- Performed data analysis to surface trends and business recommendations.
- Created conceptual data models for business and management planning, integration, and validation.
- Participated in enterprise architectural strategy, standards, and process meetings.
- Ensured naming standards were followed and that data models were scalable, reusable, and performance-oriented.
- Built a metadata management system to keep the lineage of enterprise data elements and to produce data flow reports.
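One common way to structure element-level lineage of the kind described in the last bullet is a pair of tables linking source and target elements; this DDL is an illustrative sketch, not the actual system:

```sql
-- Registry of data elements across systems
CREATE TABLE data_element (
  element_id  NUMBER PRIMARY KEY,
  system_name VARCHAR2(100) NOT NULL,
  table_name  VARCHAR2(100) NOT NULL,
  column_name VARCHAR2(100) NOT NULL
);

-- Directed source -> target links, with the transformation applied
CREATE TABLE element_lineage (
  source_element_id NUMBER REFERENCES data_element (element_id),
  target_element_id NUMBER REFERENCES data_element (element_id),
  transform_desc    VARCHAR2(4000),
  PRIMARY KEY (source_element_id, target_element_id)
);

-- Data flow report: direct upstream sources of a given element
SELECT s.system_name, s.table_name, s.column_name, l.transform_desc
FROM   element_lineage l
JOIN   data_element s ON s.element_id = l.source_element_id
WHERE  l.target_element_id = :element_id;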
DWH Engineering lead
Confidential, San Jose, CA
- Avoided costly data mismatches by revamping the Informatica-based ETL process and ensuring data from multiple sources was synced on time. Added shell scripts, run by Dollar Universe, to monitor the syncs and alert management and support teams.
- Led the effort to build four search services for SNTC (Serial Number Search and Validation, EOL and alerts notification, and Device Search and validation against maintenance contracts from ERP systems), implemented as Java RESTful services over Active MQ messaging against a 20TB+ database, writing the back-end SQL and PL/SQL for each search. These services had a customer base of 5000 customers (with tens of users per customer) and were data sources for the CSTG Data Pipeline.
- Forecasted demand for network devices by creating a Sqoop ETL process to load networking device, customer, and IP traffic data into Hadoop and by analyzing customer device requirements based on device age, network traffic, EOL, bandwidth, etc.
- Created on-demand, end-to-end data marts using SQL*Loader, SQL, PL/SQL, and shell-script-based ETL for multiple product teams to perform predictive analytics and reporting targeting customers who would need new devices and maintenance contracts based on devices nearing end of life, added traffic, etc.
- Working with product management, developed an end-to-end data warehouse and data virtualization solution to provide “one view” of contracts and underlying networking devices to thousands of customers (with tens of users per customer). Created data models (STAR schema) for this using ERwin.
- Reduced development time by streamlining development processes and improving team engagement, communication, delivery, and efficiency.
- Reduced customer escalations by training the data analysis team to monitor data quality and discover data issues before customers were impacted.
- Managed business unit budget allocated for data projects and operations and was responsible for management of infrastructure required to build quality ETL applications.
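The “one view” STAR schema mentioned above can be sketched as one fact table joining conformed dimensions; the model below is purely illustrative, with hypothetical names, not the actual production design:

```sql
-- Dimension: who owns the coverage
CREATE TABLE dim_customer (
  customer_key  NUMBER PRIMARY KEY,
  customer_name VARCHAR2(200)
);

-- Dimension: the covered networking device
CREATE TABLE dim_device (
  device_key NUMBER PRIMARY KEY,
  serial_no  VARCHAR2(50),
  product_id VARCHAR2(50),
  eol_date   DATE           -- end-of-life, for renewal forecasting
);

-- Fact: one row per device covered under a contract
CREATE TABLE fact_contract_coverage (
  customer_key NUMBER REFERENCES dim_customer (customer_key),
  device_key   NUMBER REFERENCES dim_device (device_key),
  contract_no  VARCHAR2(30),
  start_date   DATE,
  end_date     DATE
);
```

Keeping EOL on the device dimension lets the same schema serve both the contract “one view” and the device-renewal forecasting described in the earlier bullets.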
Database Engineer
Confidential, Sacramento, CA
- Designed and developed emission testing programs for various state motor vehicle departments (DMVs). This involved design, modeling, and writing PL/SQL packages, triggers, and stored procedures, along with application development in Java and C. Messaging was implemented using C daemons: each smog check arrives as a message routed to the appropriate applications for validation, tamper verification, and certification.
- Enhanced Order Entry applications for DMV call centers written in Java, modifying code to fix various SQL and data mapping issues. Used the Struts Framework.
- Created a vehicle VIN / license plate search web service to validate vehicles before emission testing is performed. Later enhanced the search algorithm and used the service to provide data to services like Carfax.
- Developed C programs and SQL, PL/SQL, SQL*Loader, shell (bash) script-based, and Informatica-based ETL for the Emission DWH, which is used to generate compliance reports for multiple DMVs.
- Created data models for emission programs and vehicle compliance reports using Oracle CASE and Erwin.
- Worked as a development DBA to create DDL, import data from production and clients, grant privileges, archiving and purging scripts, Performance Tuning and optimization, etc.
- Implemented data marts using dimensional modeling (STAR schema); data is loaded into a staging database using SQL*Loader and external tables before being aggregated into the main database.
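The external-table staging pattern in the last bullet can be sketched as follows; file, directory, table, and column names are placeholders for illustration:

```sql
-- External table: Oracle reads the flat file in place via ORACLE_LOADER,
-- so no intermediate load step is needed before transformation.
CREATE TABLE stg_emissions_ext (
  vin       VARCHAR2(17),
  test_date DATE,
  result    VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY etl_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    ( vin       CHAR(17),
      test_date CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD",
      result    CHAR(10) )
  )
  LOCATION ('emissions.csv')
);

-- Aggregate from staging into the main reporting table
INSERT INTO emissions_summary (test_month, pass_count, fail_count)
SELECT TRUNC(test_date, 'MM'),
       SUM(CASE WHEN result = 'PASS' THEN 1 ELSE 0 END),
       SUM(CASE WHEN result = 'FAIL' THEN 1 ELSE 0 END)
FROM   stg_emissions_ext
GROUP  BY TRUNC(test_date, 'MM');
```

External tables suit set-based aggregation like this, while SQL*Loader direct-path loads remain the faster option when the raw rows themselves must be persisted.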