Software Developer Engineer III Resume
SUMMARY
- 9 years of extensive experience in the IT industry, working across all phases of the SDLC: requirement gathering, analysis, application design, development, testing, implementation and maintenance.
- Big Data experience includes development with the Hadoop ecosystem (Hive, Pig, Sqoop, Flume and Oozie).
- SQL and PL/SQL programming experience includes packages, stored procedures, functions, triggers, indexes, external tables, SQL*Loader and exception handling.
- Scripting experience includes writing Bash shell scripts using the vi and nano editors.
- Domain experience includes BI analytics, data warehouses, supply chain, and retail (marketing and gift cards).
- Extensively worked on troubleshooting, query optimization and performance tuning on multiple projects.
- Experience in preparing unit test cases, conducting unit, assembly and system integration testing, and supporting business acceptance testing and post-implementation activities.
- Experience in configuration management and release management activities.
- Experience in effectively handling teams of 6-8 members.
- Experience in troubleshooting failed mission-critical production systems under extreme pressure and time constraints, including emergency code fixes.
- Experience in the production support model, including batch support, data fixes, issue analysis and generating ad-hoc reports for customers.
- Capable of processing large sets of structured, semi-structured and unstructured data and of supporting systems and application architecture.
- Able to assess business rules, collaborate with stakeholders and perform source-to-target data mapping, design and review.
- Familiar with data architecture including data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing. Experience optimizing ETL workflows.
TECHNICAL SKILLS
Languages & Technologies: Hive, Oracle PL/SQL, Hadoop, Pig, Sqoop, Flume, Oozie, Java
Operating Systems: Mac OS, Linux, Windows
Scripting: Unix Shell Script
Databases & Data Stores: HDFS, Redshift, EMR, AWS S3, Oracle 11g/10g/9i
Domain: Supply Chain, Internet Applications, Retail, BI Analytics
Config. Management: CVS, SVN, Accupass
Scheduling Tools: Autosys, Crontab
Other Tools & Utilities: SQL Developer, SQL Loader, Toad, Eclipse
PROFESSIONAL EXPERIENCE
Confidential
Software Developer Engineer III
Responsibilities:
- Created ETL scripts/data pipelines to load data from multiple sources into the target platforms, EMR and Redshift (a Redshift load sketch follows this list).
- Shared responsibility for administration of various AWS applications.
- Designed the schema, including fact and dimension tables.
- Performed database and application performance tuning.
- Created multiple shell scripts to perform various tasks (such as loading data into AWS) and scheduled them as cron jobs.
- Created Hive queries that helped business analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
- Used various Hive properties to improve query performance.
- Converted tables from text file format to ORC storage with ZLIB compression set via table properties, which increased query efficiency (as sketched after this list).
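Below is a minimal sketch of the kind of Redshift load step referenced above; the table name, S3 path, IAM role and file options are placeholders and would differ per pipeline.

```sql
-- Hypothetical example: bulk-load gzipped CSV files from S3 into a Redshift table.
-- Table name, bucket path and IAM role ARN are placeholders, not from the original project.
COPY analytics.sales_fact
FROM 's3://example-bucket/incoming/sales/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load-role'
CSV
GZIP
TIMEFORMAT 'auto';
```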
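And a hedged sketch of the text-to-ORC conversion from the last bullet; the table and column names are illustrative only.

```sql
-- Hypothetical HiveQL: rewrite a text-format staging table into ORC with ZLIB compression.
-- orders_txt / orders_orc are placeholder names.
CREATE TABLE orders_orc (
  order_id  BIGINT,
  store_id  INT,
  order_ts  TIMESTAMP,
  amount    DECIMAL(10,2)
)
STORED AS ORC
TBLPROPERTIES ('orc.compress' = 'ZLIB');

-- Copy the existing text-format data into the ORC table.
INSERT OVERWRITE TABLE orders_orc
SELECT order_id, store_id, order_ts, amount
FROM orders_txt;
```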
Confidential
System Analyst III
Responsibilities:
- Created scripts with several HQL queries to generate reports for iTunes billing-related data using Hive.
- Imported/exported data between Hive and SQL tables using Sqoop.
- Performed HQL and application tuning using bucketing and sort-merge-bucket (SMB) joins (see the sketch after this list).
- Created PL/SQL stored procedures, functions and packages.
- Created Pig scripts for loading/storing data from the local file system to HDFS.
- Created a package to send daily reports by e-mail using shell scripts.
- Created Oozie workflows to run the HQL scripts, scheduled with the help of the Autosys scheduler.
- Enabled speedy reviews and first-mover advantage by using Oozie to automate data loading into HDFS and Pig to pre-process the data.
- Managed and reviewed Hadoop log files.
- Shared responsibility for administration of Hadoop, Hive and Pig.
- Used various Hive properties to improve query performance.
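A hedged sketch of the bucketing and SMB join tuning mentioned above; the tables, join key and bucket count are assumptions for illustration, and the SET statements are standard Hive switches for sort-merge-bucket map joins.

```sql
-- Hypothetical HiveQL: bucket and sort both tables on the join key so Hive can use an SMB join.
CREATE TABLE billing_events (
  user_id  BIGINT,
  event_ts TIMESTAMP,
  amount   DECIMAL(12,2)
)
CLUSTERED BY (user_id) SORTED BY (user_id) INTO 64 BUCKETS
STORED AS ORC;

CREATE TABLE user_dim (
  user_id BIGINT,
  country STRING
)
CLUSTERED BY (user_id) SORTED BY (user_id) INTO 64 BUCKETS
STORED AS ORC;

-- On older Hive versions, enforce bucketing while loading: SET hive.enforce.bucketing = true;
-- Session properties that let the optimizer pick a sort-merge-bucket map join.
SET hive.optimize.bucketmapjoin = true;
SET hive.optimize.bucketmapjoin.sortedmerge = true;
SET hive.auto.convert.sortmerge.join = true;

SELECT e.user_id, d.country, SUM(e.amount) AS total_amount
FROM billing_events e
JOIN user_dim d ON e.user_id = d.user_id
GROUP BY e.user_id, d.country;
```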
Confidential
Sr. Database Developer
Responsibilities:
- Created PL/SQL stored procedures, functions and packages according to the project and client requirements.
- Created scripts with several HQL queries to generate reports for pricing-related data using Hive.
- Created B-tree and bitmap indexes on tables for faster data retrieval and improved database performance.
- Performed PL/SQL and application tuning using tools such as EXPLAIN PLAN, SQL Trace and AUTOTRACE (see the first sketch after this list).
- Extensively used optimizer hints to steer the optimizer toward a better query execution plan.
- Used analytic functions such as RANK, LISTAGG, LEAD, LAG and NTILE, for example to retrieve the most recent record from tables holding transactional data (see the second sketch after this list).
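First, a brief sketch of the EXPLAIN PLAN and hint usage described above; the table, index name and bind variable are hypothetical.

```sql
-- Hypothetical Oracle SQL: inspect the execution plan, then nudge the optimizer with an index hint.
EXPLAIN PLAN FOR
SELECT /*+ INDEX(p price_hist_item_idx) */
       p.item_id, p.list_price
FROM   price_history p
WHERE  p.item_id = :item_id;

-- Show the plan captured above.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```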
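Second, a minimal sketch of using an analytic function to pick the most recent transactional record, assuming the same hypothetical price_history table.

```sql
-- Hypothetical Oracle SQL: keep only the latest row per item using RANK() over a descending timestamp.
-- (ROW_NUMBER() would guarantee a single row per item if ties on updated_ts are possible.)
SELECT item_id, list_price, updated_ts
FROM (
  SELECT item_id,
         list_price,
         updated_ts,
         RANK() OVER (PARTITION BY item_id ORDER BY updated_ts DESC) AS rn
  FROM   price_history
)
WHERE rn = 1;
```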
Confidential
Sr. PL/SQL Developer
Responsibilities:
- Used the RANK function to get the most recent record from tables holding transactional data.
- Made extensive use of MD5 checksums to compare new records with existing records row by row (see the first sketch after this list).
- Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package (see the second sketch after this list).
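First, a hedged sketch of the MD5-based row comparison; it assumes EXECUTE privilege on DBMS_CRYPTO, and the helper function, tables and column list are placeholders.

```sql
-- Hypothetical PL/SQL helper: MD5 hash of a row's concatenated columns, used to detect changed rows.
CREATE OR REPLACE FUNCTION row_md5 (p_value IN VARCHAR2)
RETURN VARCHAR2
DETERMINISTIC
IS
BEGIN
  RETURN RAWTOHEX(DBMS_CRYPTO.HASH(UTL_RAW.CAST_TO_RAW(p_value),
                                   DBMS_CRYPTO.HASH_MD5));
END row_md5;
/

-- Rows whose checksum differs between the staging and target tables (placeholder names).
SELECT s.cust_id
FROM   stg_customers s
JOIN   dim_customers d ON d.cust_id = s.cust_id
WHERE  row_md5(s.cust_name || '|' || s.cust_city || '|' || s.cust_status)
    <> row_md5(d.cust_name || '|' || d.cust_city || '|' || d.cust_status);
```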
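Second, a minimal sketch of a UTL_FILE flat-file extract; the directory object, file name and query are assumptions.

```sql
-- Hypothetical PL/SQL: write pipe-delimited rows from an operational table to a flat file.
-- EXTRACT_DIR is a placeholder Oracle directory object that would need to exist.
DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'orders_extract.txt', 'w', 32767);
  FOR rec IN (SELECT order_id, order_date, amount FROM orders) LOOP
    UTL_FILE.PUT_LINE(l_file,
      rec.order_id || '|' || TO_CHAR(rec.order_date, 'YYYY-MM-DD') || '|' || rec.amount);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
EXCEPTION
  WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(l_file) THEN
      UTL_FILE.FCLOSE(l_file);
    END IF;
    RAISE;
END;
/
```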
Confidential
Technology Analyst
Responsibilities:
- Involved in the SDLC, gathering requirements from end users.
- Created PL/SQL stored procedures, functions and packages according to the project and client requirements.
- Created Unix shell scripts to monitor the system for Oracle Streams errors and other errors per client and project requirements.
- Created various Autosys and crontab jobs for automatic processing of reports and other processes.
- Used advanced bulk techniques (FORALL, BULK COLLECT) to improve performance (see the first sketch after this list).
- Used advanced PL/SQL features such as records, PL/SQL tables, object types and dynamic SQL.
- Created implicit cursors, explicit cursors and REF CURSORs.
- Partitioned historical tables and used bulk collection in PL/SQL objects to improve performance (see the second sketch after this list).
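First, a brief sketch of the bulk techniques mentioned above (BULK COLLECT with a LIMIT plus FORALL); the staging and history tables are hypothetical.

```sql
-- Hypothetical PL/SQL: fetch rows in batches with BULK COLLECT and insert them with FORALL.
DECLARE
  TYPE t_orders IS TABLE OF orders_stage%ROWTYPE;
  l_batch t_orders;
  CURSOR c_src IS SELECT * FROM orders_stage;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_batch LIMIT 1000;
    EXIT WHEN l_batch.COUNT = 0;

    -- Bulk-bind the whole batch into a single insert round trip.
    FORALL i IN 1 .. l_batch.COUNT
      INSERT INTO orders_hist VALUES l_batch(i);
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/
```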
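Second, a minimal sketch of range-partitioning a historical table by month; the table name and partition boundaries are illustrative only.

```sql
-- Hypothetical Oracle DDL: range-partition a history table by month so old data can be
-- pruned and queried efficiently.
CREATE TABLE order_history (
  order_id   NUMBER,
  order_date DATE,
  amount     NUMBER(12,2)
)
PARTITION BY RANGE (order_date) (
  PARTITION p_2013_01 VALUES LESS THAN (TO_DATE('2013-02-01', 'YYYY-MM-DD')),
  PARTITION p_2013_02 VALUES LESS THAN (TO_DATE('2013-03-01', 'YYYY-MM-DD')),
  PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);
```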