SQL/Data Engineering/DBA Resume
Lincolnshire, IL
SUMMARY
- SQL/Data Engineering/DBA professional with nearly 13 years of experience designing and developing innovative applications and achieving high client satisfaction. Solid understanding of key concepts such as PL/SQL triggers, procedures, functions, cursors, Data Pump jobs, DBMS Scheduler jobs calling shell scripts via Oracle procedures, and pipelined (PIPE ROW) table functions.
- Worked as a Team/Technical Lead for 5 years.
- Performed performance tuning services for PL/SQL applications
- Fine-tuned long-running pivot/unpivot queries (converting them into pipelined PIPE ROW functions) to avoid heavy temporary tablespace usage; also fine-tuned long-running merge and insert queries, reducing run times from hours to minutes.
- Experience in report development and data migration from CSV to Oracle databases using SQL*Loader via scheduled batch operations. Worked extensively with JSON-format files.
- Deep knowledge of System and database architecture: design, modeling and implementation
- Experience in designing and developing Oracle PL/SQL procedures, functions and packages, plus Visual Basic and UNIX shell scripting. Excellent communication and interpersonal skills.
- Understanding of Internet, intranet and extranet environments.
- Fully trained in Six Sigma and Lean Sigma methodologies, tools and techniques; equipped to lead a process-improvement project.
- Experience in SQL tuning of medium/large database environments.
- 1+ years of experience in Hadoop programming, creating Hive and Pig scripts for large sets of structured, semi-structured and unstructured data. Worked with serialization and deserialization of JSON-format files using Hive tables.
- Used Sqoop import/export to move data between HDFS/Hive and Oracle and Teradata tables.
- Extensive experience in UNIX shell scripting, hands-on Python scripting, and the ETL tool Informatica.
- Implemented Unit Testing and Integration testing during the projects
- Participated in all phases of the project life cycle ranging from analysis, design, development, production maintenance and operation support.
- An effective communicator with excellent relationship building & interpersonal skills. Strong analytical, problem solving & organizational abilities.
- Extensive SDLC experience including Requirement Analysis, Preparations of Technical Specifications Document, Design and Coding, Application Implementation, Unit testing and System Testing, Documentation, Production Support, Functional and Regression Testing.
- Worked on-site with clients in various roles for nearly three years in the United Kingdom: Hutchison 3G (Confidential)/Ericsson (Reading) and British Telecom (Cardiff).
- Provided 24x7 production support for troubleshooting mission critical applications.
- Worked with Hortonworks data management; good experience with Python, Pig, Sqoop, Oozie, Hadoop Streaming and Hive; solid understanding of the Hadoop Distributed File System.
- Extensive knowledge of ETL, including Ab Initio and Informatica
- Vast experience with Java, Puppet, Chef, Linux, Perl and Python
- In-depth understanding of MapReduce and the Hadoop Infrastructure
- Focuses on the big picture when problem-solving
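The pivot/unpivot tuning mentioned above (replacing temp-space-heavy queries with pipelined table functions) can be sketched roughly as follows; the `sales_summary` table, object types and column names are hypothetical illustrations, not from an actual engagement.

```sql
-- Hypothetical sketch: a pipelined table function that streams unpivoted
-- rows instead of materializing them in temporary tablespace.
CREATE OR REPLACE TYPE sales_row_t AS OBJECT (
  region   VARCHAR2(30),
  quarter  VARCHAR2(6),
  amount   NUMBER
);
/
CREATE OR REPLACE TYPE sales_tab_t AS TABLE OF sales_row_t;
/
CREATE OR REPLACE FUNCTION unpivot_sales RETURN sales_tab_t PIPELINED IS
BEGIN
  FOR r IN (SELECT region, q1, q2 FROM sales_summary) LOOP
    PIPE ROW (sales_row_t(r.region, 'Q1', r.q1));  -- emit one row per quarter
    PIPE ROW (sales_row_t(r.region, 'Q2', r.q2));
  END LOOP;
  RETURN;
END;
/
-- Consumers query it like a table; rows stream to the caller as produced:
SELECT * FROM TABLE(unpivot_sales);
```

Because rows are piped out one at a time, the whole unpivoted result never has to be sorted or staged at once, which is the temp-space saving claimed above.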
TECHNICAL SKILLS
Databases: Oracle 8i/9i/10g/11g/12c
Tools and Utilities: Erwin Data Modeler 8.2, Toad 11.5, PL/SQL Developer 8, Oracle SQL Developer 3.1, DbVisualizer 8.0, SQL*Loader, Data Pump, TKPROF, Import and Export, Statspack, SQL Server
Internet: HTML, XML
Middleware: IIS 6.0/5.0
Programming Languages: VB 6.0, VB Macro (VBA 7.0)
Scripting Languages: Unix Shell Scripting, AutoSys 11.0
Design Tools: Microsoft Visio 2010, Erwin data modeler
Version Control Tools: Visual SourceSafe 2010, StarTeam 12.5, Subversion 1.7, ClearCase 8.0, SCCS
Build Tools: TFS 2010, Answers Desktop, DB Look, MicroStrategy 9.3, WinSCP
Quality Tools: HP Quality Center 11.0, RTC, JIRA
Reports: Business Objects v6.5, Business Objects Enterprise XI Release 2, Crystal Reports 9.0, OLAP cubes
Data Warehousing: Netezza (10k series, Twinfin)
Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop
PROFESSIONAL EXPERIENCE
Confidential, Lincolnshire, IL
SQL/Data Engineering/ DBA
Environment: Oracle 11g/10g, SQL Developer, TOAD, SQL, PL/SQL, Linux (RHEL 6.3), Python, Hadoop (Hive and Pig scripts), JIRA, WinSCP, cron scheduler, PuTTY, JSON files
Responsibilities:
- Worked on the Genesis ecommerce platform, which hosts 15 websites in 10 countries/languages. The objective of the Genesis ecommerce service operation is to enhance customer experience, improve site stability and performance, and increase business profitability.
- Created stored procedures and functions in PL/SQL to streamline current processes and perform data transformations. Worked on Linux 6.3, creating shell scripts and automating job processes via crontab.
- Involved in setting up data loads and jobs for customers, product, orders and email subject areas for 15 countries in Europe-Staples that includes loading the data from ERP systems into Data warehouse and BI data system for reporting.
- Converted huge data loads from Oracle to Hadoop.
- Wrote Hive and Pig queries to parse raw data and populate staging tables, exported the data using Sqoop, and stored the refined data in partitioned tables in the BI data system.
- Used Python scripting to create dynamic scripts for automated data-load processing.
- Involved in performance tuning for various jobs using tuning techniques like Explain Plan, Hints.
- Provided Production Support for running daily batch jobs and fixing job failures.
- Provided solutions to production issues, fixing data problems, and was involved in the migration of Oracle SQL scripts and UNIX shell scripts.
- Used SQL*Loader to load flat files into the Oracle database.
- Used SQL Developer to write complex SQL queries producing data-analysis reports comparing the current and upgraded environments, ensuring there were no discrepancies in the data.
- Used WinSCP to transfer files between the two environments (FTP).
- Worked with DBAs on database upgrades and troubleshooting performance issues.
- Participated in team discussions and provided solutions when required; involved in system analysis, design, mapping, coding, data conversion, development and implementation.
- Performed performance tuning of PL/SQL procedures and SQL queries; defined checklists for coding, testing and deliverables.
- Developing the High Level Design for the enhancements.
- Involved in Designing schema and schema objects, coding in PL/SQL and Shell Scripting, Review of Test Cases and System Testing.
- Developed various complex queries and views for generating reports, and created dynamic procedures for various processes in the project to load different tables.
- Tuned SQL queries using Explain Plan and TKProf utilities to improve performance.
- Fine-tuned the logic in Procedures, Functions and Triggers for optimum performance.
- User Acceptance Testing support at the Client Site.
- Handling Problem reports while migrating the application to production.
- Defining best practices for project support and documentation.
- Reorganized large tables to achieve better performance.
- Converted some of the intensive processes to Hadoop using Hive, Sqoop and Pig: Sqoop-exported data from SQL Server to Hadoop for faster processing, then Sqoop-loaded it back into Oracle for reporting.
- Created shell scripts and UNIX functions using the vi editor; hands-on knowledge of utilities such as find, grep and sed. Followed Agile methodology for development activity.
- Installed and configured MapReduce, HIVE and the HDFS; implemented CDH3 Hadoop cluster on CentOS. Assisted with performance tuning and monitoring.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
- Supported code/design analysis, strategy development and project planning.
- Created reports for the BI team using Sqoop to export data into HDFS and Hive.
- Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Assisted with data capacity planning and node forecasting.
- Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
- Administered Pig, Hive and HBase, installing updates, patches and upgrades.
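The Hive staging flow described above (parsing raw JSON, populating staging tables, storing refined data in partitioned tables) might look something like the following HiveQL sketch; the table names, columns and HDFS path are illustrative assumptions.

```sql
-- Hypothetical HiveQL sketch: parse raw JSON files into a partitioned
-- staging table using the HCatalog JSON SerDe.
CREATE EXTERNAL TABLE raw_orders (
  order_id STRING,
  country  STRING,
  amount   DOUBLE
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION '/data/raw/orders';

CREATE TABLE stg_orders (
  order_id STRING,
  amount   DOUBLE
)
PARTITIONED BY (country STRING);

-- Dynamic-partition insert: each country's rows land in their own partition.
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE stg_orders PARTITION (country)
SELECT order_id, amount, country FROM raw_orders;
```

Partitioning by country keeps per-country loads and Sqoop exports scoped to a single partition rather than a full-table scan.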
Confidential, Boston, MA
SQL/Data Engineering/ DBA
Environment: Oracle 11g, PL/SQL, UNIX, AutoSys
Responsibilities:
- Interacted with the requirements team during development and with users during implementation; analyzed new business requirements, enhancements and changes.
- Worked on the new change requests and implemented them successfully.
- Supported the client, taking new change requests and fixing bugs after the implementation stage.
- Involved in performance tuning. Converted all Delete/Insert statements to Merge/Delete for all fact tables as part of performance tuning.
- SQL Performance tuning was done using Explain plan, correct Index creation, Analyzing table stats and creating the right data model.
- Performed Oracle DBA tasks such as monitoring the database, creating users, granting roles and privileges, performing application builds and running conversion scripts.
- Used development tools such as Toad and SQL Developer for writing code and executing the queries etc.
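The delete/insert-to-merge conversion noted above can be illustrated with a minimal sketch; `fact_sales` and `stg_sales` are hypothetical table names, not from the actual engagement.

```sql
-- Hypothetical sketch: a single MERGE replaces a delete-then-insert
-- refresh of a fact table, updating matched rows and inserting new ones
-- in one pass instead of two.
MERGE INTO fact_sales f
USING stg_sales s
   ON (f.sale_id = s.sale_id)
WHEN MATCHED THEN
  UPDATE SET f.amount    = s.amount,
             f.load_date = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (sale_id, amount, load_date)
  VALUES (s.sale_id, s.amount, SYSDATE);
```

The single-pass MERGE avoids deleting and re-inserting unchanged rows, which is where the performance gain over delete/insert typically comes from.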
Confidential, Schaumburg, Illinois
Environment: Oracle 11g, PL/SQL, Netezza, UNIX, Oracle APEX
Responsibilities:
- Supported the client, taking new change requests and fixing bugs after the implementation stage.
- Used DBMS_SCHEDULER to set up batch jobs for processing large amounts of data in parallel.
- Some data in the conversion process was provided by a third-party application in very large files; used SQL*Loader to load these files into the database.
- Wrote UNIX scripts to automate the conversion process, chaining the various conversion steps and calling them in order. The process was monitored and provided online progress reporting.
- Used parallel batch processing to drastically reduce the time taken to load millions of bank transaction records; wrote algorithms to split the records into batches and process them separately.
- SQL Performance tuning was done using Explain plan, correct Index creation, Analyzing table stats and creating the right data model.
- Used development tools such as Toad and SQL Developer for writing code and executing the queries etc.
- Worked with the version control application Subversion, plugged in using Eclipse.
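The DBMS_SCHEDULER parallel batch setup described above might be sketched as follows; the job names and the `conv_pkg.process_batch` procedure are assumptions for illustration.

```sql
-- Hypothetical sketch: launch four DBMS_SCHEDULER jobs in parallel,
-- one per batch, each calling the same conversion procedure with its
-- batch number as an argument.
BEGIN
  FOR i IN 1 .. 4 LOOP
    DBMS_SCHEDULER.CREATE_JOB(
      job_name            => 'CONV_BATCH_' || i,
      job_type            => 'STORED_PROCEDURE',
      job_action          => 'conv_pkg.process_batch',
      number_of_arguments => 1,
      enabled             => FALSE);
    DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE('CONV_BATCH_' || i, 1, TO_CHAR(i));
    DBMS_SCHEDULER.ENABLE('CONV_BATCH_' || i);  -- jobs run concurrently
  END LOOP;
END;
/
```

Each enabled job runs in its own session, so the batches execute concurrently and overall load time drops roughly with the degree of parallelism.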
Confidential, Schaumburg, Illinois
Environment: Oracle 11g, PL/SQL, UNIX, Netezza
Responsibilities:
- Directly engaged with different clients (Kimberly-Clark, Kraft, Schwan) to understand their POS data
- Worked on Netezza to load data from Mainframe systems. Wrote Korn shell scripts to call PL/SQL procedures, functions, triggers.
- Wrote AutoSys jobs to orchestrate the overall workflow
- Worked on Data Analysis and Data Mining
- Understood system behavior by analyzing the POS data at granular levels
- Delivered data in any desired report format
- Involved in the review of Requirement document.
- Involved in the review of Functional specification documents and performing estimations.
- Team Lead for handling the production tickets and the change requests.
- Back-end application development using Oracle 11g PL/SQL. Develop stored packages, procedures, and functions; create tables, views, constraints, foreign keys and indexes.
- Created conversion scripts and executed them over very large data sets, ranging from 50 million customer accounts to 500 million transaction records. SQL tuning methods such as explain plan, correct index creation and hints were used to obtain optimal performance.
- Extensively used advanced PL/SQL concepts such as collections and bulk processing to achieve the best performance. These were used in large data-conversion scripts to process records within a procedure or package before updating them in the database.
- Oracle data warehousing was used to analyze large amounts of data without affecting the production database. Materialized views made a daily copy of transaction records from the production system, which was then used to analyze and generate reports.
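The bulk-processing pattern mentioned above (collections with bulk operations) typically looks like the following; `txn_staging`, `transactions` and the 10,000-row limit are illustrative assumptions, not the actual conversion code.

```sql
-- Hypothetical sketch: BULK COLLECT ... LIMIT plus FORALL, the standard
-- PL/SQL bulk pattern for converting large record sets in fixed-size
-- batches instead of row-by-row.
DECLARE
  CURSOR c_txn IS SELECT txn_id, amount FROM txn_staging;
  TYPE txn_tab_t IS TABLE OF c_txn%ROWTYPE;
  l_batch txn_tab_t;
BEGIN
  OPEN c_txn;
  LOOP
    FETCH c_txn BULK COLLECT INTO l_batch LIMIT 10000;
    EXIT WHEN l_batch.COUNT = 0;
    -- Records can be transformed in memory here, then applied with a
    -- single bulk DML statement per batch.
    FORALL i IN 1 .. l_batch.COUNT
      UPDATE transactions
         SET amount = l_batch(i).amount
       WHERE txn_id = l_batch(i).txn_id;
    COMMIT;  -- commit per batch keeps undo/redo bounded
  END LOOP;
  CLOSE c_txn;
END;
/
```

The LIMIT clause caps PGA memory per batch, and FORALL reduces SQL-engine context switches to one per batch, which is where the bulk-processing speedup comes from.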
Confidential
Environment: MS Project Professional, MS Visio, EPM
Responsibilities:
- Team Lead for handling the production tickets and the change requests raised.
- Was instrumental in growing the team from 5 people to 25 people.
- Was instrumental in winning the EPM implementation project at the client site.
- Pivotal in the creation of a centralized PMO and PSO that has recognized value in supporting the strategic programs.
- Achieved budget by promoting goods receipting to ensure costs were accrued.
- Increased accuracy of supplier time recording from 46% to 100% providing financial control and improved cash flow.
- Implemented an improvement to the provision of project codes, saving two person-days of effort per month.
- Supported recruitment, and led the onboarding, training and development of new starters.
- Represented Program Office as business partner to Finance Department.
- Devised and delivered training catalog for the Program Office team including formal training, self-service “How to” guides and key stroke level work instructions.
- Developed PMO brand, intranet site - People, Process, Data, and Tools.
- Deputized for PMO Manager as required in all activities.
- Provided secretariat for investment board meetings.
- Award for recognition - Exemplary H3G behaviors (Open and Honest, Keeping your word, Taking responsibility)