
Lead Software Engineer Resume


TECHNICAL SKILLS:

Operating Systems: Windows and UNIX/Linux

Programming/Scripting: Python, R, Scala, Java/JEE, C, C++, XML, Perl, PHP, JSON, HQL, SQL, PL/SQL

Database Utilities: Toad, SQL Developer, Artisan

Job Scheduling Tools: Control-M, Autosys

Databases/RDBMS: Oracle 12c & 12cR2, SQL Server, DynamoDB

Cloud Administration: AWS (EC2, S3, EMR, RDS, IAM, VPC, REDSHIFT)

GUI/Tools: Eclipse, JDeveloper, Oracle R12

Reporting Tools: SSRS, OBIEE, Crystal Reports

Statistics and Probability: Natural Language Processing, Supervised and Unsupervised Learning, Multivariate Statistics and Analysis, Probability Distribution, Decision Trees, Monte Carlo Simulation, Bayesian Networks, Finite Element Analysis and more

Big data: Hadoop, MapReduce, Hive, Pig, Mahout, Flume, Sqoop, Ambari, Kafka, Zookeeper, Oozie, Hortonworks HDP, Cloudera CDH, Spark

Testing: Informatica TDM, HP QC

PROFESSIONAL EXPERIENCE:

Confidential

Lead Software Engineer

Responsibilities:

  • Managed projects from technical delivery, with extensive coordination with the offshore team to ensure on-schedule deliveries with quality.
  • Communicated project status, including project issues, up and down the management chain, including senior management.
  • Communicated the project design to other application designers and team members.
  • Managed change requests, including helping the sponsor and stakeholders understand the impact of a change on schedule or features.
  • Responsible for resolving design issues and developing strategies for ongoing improvements that support system flexibility, performance and metrics reporting.
  • Involved in implementing the migration of Oracle RDBMS data to AWS RDS and EMR.
  • Conducted system designs and feasibility studies and recommended cost-effective cloud solutions.
  • Provided Big Data architectural services to clients - Apache Hadoop, Hive SQL, HBase, Flume, Sqoop, ZooKeeper, Elasticsearch, Impala and Ambari.
  • Implemented Spring Core and J2EE patterns such as MVC, Dependency Injection (DI) and Inversion of Control (IoC).
  • Implemented REST web services with the Jersey API to handle customer requests.
  • Developed test cases using JUnit and used Log4j as the logging framework.
  • Developed the user interface using HTML, Spring tags, JavaScript, jQuery and CSS.
  • Developed the application using the Eclipse IDE and worked in an Agile environment.
  • Designed and implemented front-end web pages using CSS, JSP, HTML, JavaScript, Ajax and Struts.
  • Used the Eclipse IDE as the development environment to design, develop and deploy Spring components on WebLogic.
  • Worked cooperatively with others and took the steps necessary to ensure successful project execution, using strong verbal communication skills.
  • Responsible for release-level tasks; participated in 'Lessons Learned' meetings and applied the findings to improve processes on new projects.
  • Designed validation strategies and reviewed test cases and test plans prepared by the testing team.
  • Handled multiple projects simultaneously and communicated requirements and status effectively.
  • Delivered results within assigned, sometimes very short, timeframes.
  • Actively involved in the entire Software Development Life Cycle, including design, development, testing, deployment and support of software systems.
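The Dependency Injection / Inversion of Control pattern referenced above can be shown without any framework; a minimal Python sketch under illustrative names (Spring wires the equivalent via annotations or XML, and `OrderService`/`PaymentGateway` are hypothetical):

```python
class PaymentGateway:
    """Hypothetical collaborator interface."""
    def charge(self, amount):
        raise NotImplementedError

class FakeGateway(PaymentGateway):
    """Test double; records charges instead of calling a real processor."""
    def __init__(self):
        self.charged = []
    def charge(self, amount):
        self.charged.append(amount)
        return True

class OrderService:
    # The dependency arrives through the constructor instead of being
    # constructed inside the class -- the core idea of DI/IoC.
    def __init__(self, gateway: PaymentGateway):
        self._gateway = gateway
    def place_order(self, amount):
        return self._gateway.charge(amount)

# Wiring happens at the edge (the role a DI container plays in Spring).
gateway = FakeGateway()
service = OrderService(gateway)
service.place_order(42.0)
```

Swapping `FakeGateway` for a production implementation requires no change to `OrderService`, which is what makes the pattern easy to test.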

Confidential

Senior Software Engineer

Responsibilities:

  • Implemented Spark RDD transformations to map business analysis and applied actions on top of the transformations.
  • Developed Scala scripts and UDFs using both DataFrames/Spark SQL and RDD/MapReduce in Spark for data aggregation and queries, writing data back into the RDBMS through Sqoop.
  • 3+ years of design and development experience in Big Data Hadoop technologies, including building, loading and analysing data on Spark, HDFS & YARN clusters and MapReduce.
  • Involved in data ingestion via Sqoop and HDFS put/copyFromLocal, and in MapReduce jobs.
  • Strong understanding of Hadoop architecture and hands-on experience with Hadoop components such as JobTracker, TaskTracker, NameNode, DataNode and the HDFS framework.
  • Deployed Spark applications using SBT.
  • Extensive knowledge of NoSQL databases such as HBase, Cassandra and MongoDB.
  • Involved in loading data from the UNIX file system into HDFS; installed and configured Hive, wrote Hive UDFs and handled cluster coordination services through ZooKeeper.
  • Conducted scenario-based capacity planning for Big Data ecosystem tools and predictive analytics of resource utilization to accommodate growth, using BMC Capacity Optimizer.
  • Implemented Spark jobs utilizing the DataFrame and Spark SQL APIs for faster data processing.
  • Used Spark for interactive queries and processing of streaming data, and integrated it with popular NoSQL databases for large data volumes.
  • Extensively used classes, objects, traits, case classes and functions to write business logic in Scala.
  • Hands-on experience with the Scala collections framework.
  • Created reusable transformations (Joiner, Sorter, Aggregator, Lookup, Router) in Informatica PowerCenter.
  • Worked with the BI environment on report preparation (physical, business and presentation layers).
  • Used Control-M with shell scripts to manage all scheduled jobs.
  • Worked extensively on operations support for the production environment.
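The Spark work above follows a transformations-then-action shape: narrow per-record steps (map/filter) followed by a keyed aggregation that materializes a result. Since no cluster is assumed here, a plain-Python sketch of that shape with made-up field names:

```python
records = [
    {"region": "east", "amount": 120.0},
    {"region": "west", "amount": 80.0},
    {"region": "east", "amount": 50.0},
]

# "Transformation" step: map each record to a (key, value) pair,
# filtering out non-positive amounts (a map/filter equivalent).
pairs = [(r["region"], r["amount"]) for r in records if r["amount"] > 0]

# "Action" step: reduceByKey-style aggregation that produces the result.
totals = {}
for region, amount in pairs:
    totals[region] = totals.get(region, 0.0) + amount
```

In Spark proper this would be something like `rdd.map(...).filter(...).reduceByKey(_ + _)` in Scala, with the transformations evaluated lazily until the action runs.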

Confidential

ETL Developer

Responsibilities:

  • Managed and developed the ETL process for master and transactional data.
  • Prepared the data-processing system and configured job processes, including loads from external mainframe sources.
  • Extensive use of Oracle-supplied packages, especially DBMS_PARALLEL_EXECUTE, to run jobs in parallel for dumps and other processing of large objects.
  • Worked with direct-path loads and partitioning for rapid movement of data between locations.
  • Extensive use of recursive queries for pattern matching when expanding large location datasets.
  • Analyzed execution plans, trace files and ADDM reports for process optimization, especially when handling very large data movements.
  • Created ETL workflows in SSIS to push master and transaction data to BI (SSRS) for user-defined report preparation.
  • Worked with the BI (SSRS) environment on report preparation (physical, business and presentation layers).
  • Created partitioned tables and partitioned indexes for manageability and scalability of the application.
  • Created shell scripts and SQL script files for Autosys.
  • Used BULK COLLECT and FORALL and tuned processing of very large data volumes.
  • Created database objects and developed and modified PL/SQL packages, functions, procedures and triggers.
  • Created stored procedures using EXECUTE IMMEDIATE (native dynamic SQL) and REF CURSORs.
  • Scheduled Oracle jobs using DBMS_JOB and DBMS_SCHEDULER.
  • Responsible for developing and maintaining very large data objects and proposing appropriate solutions for handling huge data volumes.
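The BULK COLLECT/FORALL bullets describe batched array binding rather than row-by-row DML: one round trip per batch instead of one per row. A rough Python/sqlite3 analogue of the same idea (the `staging` table and data are made up for illustration; in PL/SQL this would be `FORALL i IN ...` over a collection):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, val TEXT)")

rows = [(i, f"row-{i}") for i in range(1000)]

# Batched insert: the driver binds the whole array in one statement
# execution loop, the same win FORALL bulk binding gives in PL/SQL.
conn.executemany("INSERT INTO staging (id, val) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
```

The benefit grows with row count, since per-statement overhead (context switches between the SQL and procedural engines, in Oracle's case) is paid once per batch.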

Confidential

Software Engineer

Responsibilities:

  • Involved in gathering business requirements by attending joint sessions and interacting with business users.
  • Performed gap analysis, workflow analysis and system analysis; created process-flow documents and high-level design documents.
  • Created and modified database tables, indexes, triggers, packages, functions and procedures.
  • Responsible for database development, database design, query optimization and performance tuning.
  • Created partitioned tables and partitioned indexes for manageability and scalability of the application.
  • Created materialized views, used BULK COLLECT and tuned the data.
  • Handled QA tickets and analyzed trouble tickets.
  • Knowledgeable about system configuration for database PL/SQL applications.
  • Loaded data from external systems using SQL*Loader.
  • Developed customized Crystal Reports XI R2 reports against the Oracle database with SQL queries, based on functional requirements from the client.
  • Developed reports using customized SQL, drill-down options, grouping, running totals and summary reports.
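Grouping with running totals and group summaries, as in the reports above, reduces to a small pattern; a plain-Python sketch with invented sample data (reporting tools compute the same thing declaratively):

```python
from itertools import groupby

# (group_key, value) pairs, pre-sorted by group as groupby requires.
sales = [("A", 10), ("A", 5), ("B", 7), ("B", 3), ("B", 1)]

report = []
for key, items in groupby(sales, key=lambda row: row[0]):
    running = 0
    for _, value in items:
        running += value                       # running total within group
        report.append((key, value, running))
    report.append((key, "subtotal", running))  # group summary line
```

In SQL the running total would be `SUM(value) OVER (PARTITION BY key ORDER BY ...)` and the subtotal a `GROUP BY` rollup.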

Confidential

Developer and Production support

Responsibilities:

  • Collected requirements for the ESA, ODS and EDW systems from the business analysts.
  • Performed data modeling for the ESA, ODS and EDW.
  • Designed dimensional logical data models in OWB tools for analysis purposes.
  • Used star and snowflake schemas in the data warehouse environment.
  • Supported other teams, such as SOA, Web and ETL, with the data model.
  • Identified the data-loading strategy.
  • Developed custom reports using Oracle Reports (RDF).
  • Used SQL*Loader and external tables for regular data migration from other systems.
  • Designed the PL/SQL process flow for the ODS and EDW.
  • Developed the loading engine using advanced Oracle PL/SQL (bulk collection, REF CURSORs, table functions, etc.) to load data into the ODS and EDW in a multi-language environment.
  • Developed advanced SQL using analytic functions for other teams such as SOA and Web.
  • Used Java classes as external routines in this project for building processes.

Confidential

Senior ERP Analyst

Responsibilities:

  • Created ETL workflows in SSIS to push master and transaction data to BI (SSRS) for user-defined report preparation.
  • Worked with the BI (SSRS) environment on report preparation (physical, business and presentation layers).
  • Created partitioned tables and partitioned indexes for manageability and scalability of the application.
  • Created shell scripts and SQL script files for Autosys.
  • Used BULK COLLECT and FORALL and tuned processing of very large data volumes.
  • Created database objects and developed and modified PL/SQL packages, functions, procedures and triggers.
  • Created stored procedures using EXECUTE IMMEDIATE (native dynamic SQL) and REF CURSORs.
  • Scheduled Oracle jobs using DBMS_JOB and DBMS_SCHEDULER.
  • Responsible for developing and maintaining very large data objects and proposing appropriate solutions for handling huge data volumes.

Confidential

Senior ERP Development Engineer

Responsibilities:

  • Extensively involved in backend data processing using Oracle 10g to keep the system running smoothly.
  • Reviewed PL/SQL code for data accuracy and performance before deploying it to production.
  • Handled advanced PL/SQL techniques (e.g., bulk collection, REF CURSORs, table functions).
  • Advanced SQL development using analytic functions and query optimization.
  • Moved data using Oracle Data Pump.
  • Ensured project compliance with organizational QA standards and QA testing; participated in phase-end and in-phase quality audits of the project; ensured all deliverables were on time; handled IT security and controls (authentication & authorization).
