
Senior Software Engineer Resume


SUMMARY

  • 11 years of IT experience in analyzing, designing, developing, implementing and testing software applications; currently working as a Senior Software Engineer.
  • 4 years of hands-on experience in Big Data/Hadoop ecosystems (HDFS, MapReduce, Pig, Hive, Sqoop, Flume, Oozie, ZooKeeper).
  • 7+ years of hands-on experience in software design and development with Informatica ETL, Oracle EBS R12, Oracle PL/SQL, Mainframe, Python and Unix/Bash scripting.
  • Highly experienced IT professional with 11 years of commitment to excellence and the implementation of best practices worldwide, specializing in Big Data (Hadoop ecosystems) and Informatica (ETL).
  • Hands-on experience with the major components of the Hadoop ecosystem (Flume, ZooKeeper, Oozie, Hive, Sqoop and Pig).
  • Excellent understanding of Hadoop architecture and the different components of Hadoop clusters on Hortonworks or Cloudera.
  • Create and maintain Hive/MySQL warehouses for Hive analysis.
  • Experience in Python programming and Unix bash scripting to create scripts for the Hadoop environment.
  • Strong experience with various data structures (array, stack, linked list and tree).
  • Extensively used Apache Sqoop for efficiently transferring bulk data between Apache Hadoop and relational databases (Oracle/MySQL).
  • Automated Sqoop, Hive and Pig jobs using Oozie scheduling.
  • Helped business team by developing, installing and configuring Hadoop ecosystem components.
  • Configured & deployed and maintained multi-node Dev and Test Kafka Clusters.
  • Developed multiple Kafka Producers and Consumers from scratch as per the business requirements.
  • Responsible for creating, modifying and deleting topics (Kafka Queues) as and when required by the Business team.
  • Developed test cases and POCs to benchmark and verify data flow through the Kafka clusters.
  • Extensively worked on developing ETL programs to support data extraction, transformation and loading using Informatica PowerCenter.
  • Experience in resolving ongoing production and maintenance issues and bug fixes; monitoring Informatica sessions as well as improving performance and fine-tuning mappings and sessions.
  • Proficiency in developing SQL with various relational databases like Oracle, SQL Server.
  • Technically demonstrated and delivered various projects in Informatica ETL following different methodologies (Waterfall, Agile, multi-project generation).
  • Experience in all phases of data warehouse development, from requirements gathering to code development, unit testing and documentation.
  • Good working knowledge of various Informatica client tools such as Source Analyzer, Mapping Designer, Transformation Developer, Repository Manager and Workflow Manager, and exposure to Mainframe and UNIX shell scripting.
  • Strong understanding of verticals in Oracle Finance (Oracle EBS modules - AP, AR, PA, FA, OAT and GL).
  • Six Sigma Green Belt Certification from Confidential Appliance & Lighting.
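
Illustrative sketch of the Kafka producer/consumer pattern referenced above. This is not project code; it assumes the kafka-python client, and the broker address, topic and group id are placeholders.

    from kafka import KafkaProducer, KafkaConsumer

    # Broker address, topic and group id below are placeholders.
    producer = KafkaProducer(bootstrap_servers=["broker1:9092"])
    producer.send("orders", value=b'{"order_id": 1, "amount": 25.00}')
    producer.flush()

    consumer = KafkaConsumer(
        "orders",
        bootstrap_servers=["broker1:9092"],
        group_id="reporting",
        auto_offset_reset="earliest",
    )
    for message in consumer:
        # Each record carries topic, partition offset and the raw payload.
        print(message.topic, message.offset, message.value)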

TECHNICAL SKILLS

  • Informatica (ETL)
  • Oracle EBS R12.2.3 (FA,AP,PA,OAT and VT)
  • Python 2.6.6
  • Oracle(11g), PL/SQL
  • Unix Shell Scripting

PROFESSIONAL EXPERIENCE

Confidential

Senior Software Engineer

Responsibilities:

  • Extensively worked with partitioned and bucketed tables in Hive and designed both managed and external tables in Hive to optimize performance (a Hive DDL sketch follows this list).
  • Created and maintained Sqoop jobs with full-refresh and incremental loads to populate Hive external tables.
  • Experience in writing Python scripts and Unix bash scripts.
  • Worked on Pig to perform data transformations, event joins, filtering and some pre-aggregations before storing the data onto HDFS.
  • Worked on creating Oozie workflows for daily ingestion jobs.
  • Developed simple to complex MapReduce jobs using Hive, Sqoop and Pig.
  • Designed and developed Pig Latin scripts and Pig command-line transformations for data joins.
  • Created Hive tables on the imported data for validation and debugging.
  • Experience in using Sqoop to migrate data between HDFS and MySQL, Oracle and Teradata.
  • Designed and created Oozie workflows to schedule and manage Hadoop, Hive, Pig and Sqoop jobs for daily loads.
  • Designed and deployed a big data analytics data services platform for data ingestion into the Hadoop data lake.
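
A minimal sketch of the partitioned, bucketed external-table design mentioned above, assuming the PyHive client; the host, table name, columns and HDFS location are placeholders.

    from pyhive import hive

    # Connection details are placeholders for illustration.
    conn = hive.Connection(host="hive-server", port=10000, username="etl_user")
    cur = conn.cursor()

    # External table: Hive does not own the HDFS files, so dropping the table
    # leaves the data in place. Partitioning by load date prunes daily scans;
    # bucketing on customer_id helps joins and sampling.
    cur.execute("""
        CREATE EXTERNAL TABLE IF NOT EXISTS sales_ext (
            order_id    BIGINT,
            customer_id BIGINT,
            amount      DOUBLE
        )
        PARTITIONED BY (load_date STRING)
        CLUSTERED BY (customer_id) INTO 16 BUCKETS
        STORED AS ORC
        LOCATION '/data/raw/sales'
    """)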

Confidential

System Analyst

Responsibilities:

  • Extensively worked with partitioned and bucketed tables in Hive and designed both managed and external tables in Hive to optimize performance.
  • Created and maintained Sqoop jobs with full-refresh and incremental loads to populate Hive external tables.
  • Worked on Pig to perform data transformations, event joins, bot-traffic filtering and some pre-aggregations before storing the data onto HDFS.
  • Worked with highly unstructured and semi-structured data.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Designed and developed Pig Latin scripts and Pig command-line transformations for data joins and custom processing of MapReduce outputs.
  • Created Hive tables on the imported data for validation and debugging.
  • Experience in using Sqoop to migrate data between HDFS and MySQL or Oracle (a Sqoop import sketch follows this list).
  • Designed and created Oozie workflows to schedule and manage Hadoop, Hive, Pig and Sqoop jobs.
  • Installed and configured Hive, Pig, ZooKeeper and other ecosystem components as part of a POC.
  • Designed and deployed a big data analytics data services platform for data ingestion into Hadoop.
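
A hedged sketch of the Sqoop incremental-import pattern used to move relational data into HDFS; the JDBC URL, credentials, table and target directory are placeholders.

    import subprocess

    # All connection details, table names and paths are placeholders.
    subprocess.check_call([
        "sqoop", "import",
        "--connect", "jdbc:mysql://dbhost/sales",
        "--username", "etl_user",
        "--password-file", "/user/etl_user/.db_password",
        "--table", "orders",
        "--target-dir", "/data/raw/orders",
        "--incremental", "append",      # pull only rows newer than the last run
        "--check-column", "order_id",
        "--last-value", "0",
        "--num-mappers", "4",
    ])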

Confidential

System Analyst

Responsibilities:

  • Involved in design and development phases of Software Development Life Cycle (SDLC).
  • Developed a data pipeline using Flume and Sqoop to ingest user behavioral data and purchase histories into HDFS for analysis.
  • Used Pig to perform data validation on the data ingested using Sqoop and Flume, and pushed the cleansed data set into HBase.
  • Used Hive to analyze data ingested into the Hadoop environment and compute various metrics for reporting on the dashboard.
  • Developed job flows in Oozie to automate the workflow for Pig and Hive jobs (an Oozie submission sketch follows this list).
  • Loaded the aggregated data onto MySQL from the Hadoop environment using Sqoop for reporting on the dashboard.
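
A minimal sketch of submitting one of the Oozie job flows mentioned above via the standard oozie command-line client; the Oozie URL and job.properties path are placeholders, and job.properties is assumed to point at a workflow.xml in HDFS that chains the Pig and Hive actions.

    import subprocess

    # Oozie server URL and properties file are placeholders.
    # Status can later be checked with: oozie job -oozie <url> -info <job_id>
    subprocess.check_call([
        "oozie", "job",
        "-oozie", "http://oozie-host:11000/oozie",
        "-config", "/home/etl_user/daily_load/job.properties",
        "-run",
    ])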

Confidential

System Analyst

Responsibilities:

  • Implemented several critical Informatica ETL load modules and worked closely with business clients to drive closing defects down to zero.
  • Extensively worked on ETL code using the Informatica tool to meet requirements for extracting, transforming and loading data from source to target tables.
  • Extensively worked on flat file source and target mappings and transformations such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Router, Union, Update Strategy, etc., implementing business rules and interfacing with various legacy source systems (Mainframe).
  • Analyzing the sources, transforming data, mapping the data and loading the data into targets using Informatica Power Center Designer.
  • Created reusable transformations to load data from operational data source to Data Warehouse and involved in capacity planning and storage of data.
  • Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the Mapping Designer.
  • Used various transformations like Stored Procedure, Connected and Unconnected lookups, Update Strategy, Filter transformation, Joiner transformations to implement complex business logic.
  • Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.
  • Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL (a PL/SQL sketch follows this list).
  • Implemented restart strategy and error handling techniques to recover failed sessions.
  • Used Unix Shell Scripts to automate pre-session and post-session processes.
  • Fine-tuned the process to improve Data Extraction, Data process and Load time.
  • Extensively worked on complex SQL Queries involving multiple tables with joins.
  • Applied slowly changing dimensions like Type 1 and 2 effectively to handle the delta Loads.
  • Worked on mainframe JCL and datasets to extract data from legacy systems into Informatica.
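
A minimal sketch of the PL/SQL error-handling piece mentioned above, assuming the cx_Oracle driver; the DSN, credentials, procedure and table names are hypothetical.

    import cx_Oracle

    # Connection details and object names below are hypothetical.
    conn = cx_Oracle.connect("etl_user", "secret", "dwhost/DWPRD")
    cur = conn.cursor()

    # Simple autonomous error-logging procedure that post-session scripts can
    # call before a session restart.
    cur.execute("""
        CREATE OR REPLACE PROCEDURE log_etl_error (
            p_session_name IN VARCHAR2,
            p_error_msg    IN VARCHAR2
        ) AS
            PRAGMA AUTONOMOUS_TRANSACTION;
        BEGIN
            INSERT INTO etl_error_log (session_name, error_msg, logged_at)
            VALUES (p_session_name, p_error_msg, SYSDATE);
            COMMIT;
        END;
    """)

    # Sample call to verify the procedure compiles and logs a row.
    cur.callproc("log_etl_error", ["wf_load_customers", "lookup cache build failed"])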

Confidential

System Analyst

Responsibilities:

  • Designed and developed mappings, transformations, sessions, workflows, ETL batch jobs and shell scripts to load data from sources into targets using Informatica, PL/SQL and Unix shell scripts.
  • Demonstrated technical expertise in migrating configurations from one instance to another.
  • Provided permanent fixes for all break-fix items/issues as part of production support.
  • Provided a permanent fix for discounts in VT, which simplified the entire VT discount process for the PA/FA/AP project.
  • Expertise in debugging SQL and PL/SQL to identify and fix issues so that transactions are processed smoothly to GL through VT.
  • As an SME, ensured day-to-day operations/jobs ran smoothly and simplified the entire VT process by fixing critical issues (e.g., VT discounts, asset merge and progress payments), improving IT productivity by 800 hours annually.
  • Fine-tuned the VT process by creating an error report on failures so that the business team is notified (a report sketch follows this list).
  • Responsible for all activities related to the development/enhancements, testing and implementation.
  • Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
  • Created request sets and their incompatibilities so that there would be no conflicts between requests/sets during an outage.
  • Closed 35+ VT external support requests by working with the VT external team, which helped stabilize the system within 6 months after go-live.
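
A hypothetical sketch of the failure report described above, assuming cx_Oracle and a made-up interface error table; every connection detail, table and column name here is illustrative only.

    import csv
    import cx_Oracle

    # Connection details and the error table/columns are hypothetical.
    conn = cx_Oracle.connect("apps_ro", "secret", "ebshost/EBSPRD")
    cur = conn.cursor()
    cur.execute("""
        SELECT transaction_id, error_code, error_message, creation_date
          FROM xx_vt_interface_errors
         WHERE creation_date >= TRUNC(SYSDATE) - 1
    """)

    # Dump the previous day's failures to a CSV the business team can review.
    with open("vt_error_report.csv", "w") as out:
        writer = csv.writer(out)
        writer.writerow(["transaction_id", "error_code", "error_message", "creation_date"])
        writer.writerows(cur)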

Confidential

Associate - Projects

Responsibilities:

  • Involved in the ETL technical design discussions and prepared ETL high level technical design document.
  • Involved in the analysis of source-to-target mappings provided by data analysts and prepared functional and technical design documents.
  • Knowledge in full life cycle development of data warehousing.
  • Experience with dimensional modeling using star schema and snowflake models.
  • Created UNIX shell scripts to run the Informatica workflows and control the ETL flow (a pmcmd sketch follows this list).
  • Extracted high volumes of data from mainframe data sets, flat files and Oracle using Informatica ETL mappings and SQL/PLSQL scripts, and loaded it into the data store area.
  • Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator and Router transformations to extract, transform and load data into the database.
  • Developed Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results using workflow monitor.
  • Expertise in scheduling Informatica jobs using the Informatica and Windows schedulers.
  • Fixed production issues and bugs and worked with the business team on enhancements.
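
A minimal sketch of kicking off an Informatica workflow from a script using the standard pmcmd client, as referenced above; the integration service, domain, folder, credentials and workflow name are placeholders.

    import subprocess

    # Service, domain, folder, credentials and workflow name are placeholders.
    subprocess.check_call([
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DW",
        "-d", "Domain_DW",
        "-u", "etl_user",
        "-p", "secret",
        "-f", "FINANCE_DW",
        "-wait",                  # block until the workflow completes
        "wf_load_gl_balances",
    ])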
