
Software Developer Resume


PROFESSIONAL SUMMARY:

  • 12+ years of total experience in the IT industry across analysis, design, development, testing, maintenance, and implementation of application software.
  • 4+ years of experience designing, developing, and deploying Hadoop ecosystem components and applications, both on premise and in the cloud.
  • 2+ years of experience in integration and reporting services; worked with SSIS and the .NET Report Viewer to generate graphical reports.
  • 3 years of experience in analysis, design, coding, testing, and production support using C#, ASP.NET, VB.NET, ADO.NET, Win Forms, Web Forms, web services, Visual Studio, IIS, JavaScript, Ajax, and SQL Server 2000/2005/2008 in Windows environments.
  • 3 years of experience on cloud computing (Azure and AWS) projects.
  • Pursuing the Microsoft Azure Data Fundamentals and Solution Architect certifications.
  • Pursuing Confluent Kafka certification.
  • 1+ years of experience with the Talend Big Data ETL tool.
  • Working experience with DevOps (CI/CD); developed build and release pipelines in Azure DevOps (VSTS) for Big Data projects.
  • Working experience with SSIS and reporting modules (.NET Report Viewer).
  • Practical experience operating in an Agile development environment; tools used: VSTS, Rally, VersionOne.
  • Code versioning, build, and ticketing tools used: Bitbucket, Git, VSTS, Azure Repos, Jenkins, JIRA.
  • Worked on real-time data integration using Confluent Kafka.
  • Domain experience in Banking (wholesale lending), Retail, Transportation, and Health Care.
  • Worked on pre-sales and marketing.
  • Excellent technical problem-solving and code-debugging skills.
  • Good working knowledge of SDLC and Agile software methodologies.
  • Strongly deadline-oriented.
  • Supported code/design analysis, strategy development, and project planning.

TECHNICAL SKILLS:

Big Data Ecosystems: Hadoop, HDFS, MapReduce, Hive, Pig, Spark, Apache Sqoop, Syncsort DMX-h, Impala, Kafka, NiFi.

Scheduling: Apache Oozie, Autosys, and TAC.

Programming Languages: C/C++, Java, Scala, C#.NET, VB.NET, PL/SQL and XML.

Operating Systems: Windows 95/98/NT/2000/XP/7, CentOS, RHEL.

Technologies: ADO.NET, ASP.NET.

Integration Tools: SSIS, .NET Report Viewer, Talend.

Version Control: Team Foundation Server 2010, Visual Source Safe 6.0, SVN.

Database: MS SQL Server 2000/2005/2008, MS Access, Oracle.

Hadoop Distribution: Cloudera, Hortonworks.

IDE: Visual Studio 6.0, Visual Studio.NET 2003/05/08/10, Eclipse, IntelliJ.

ETL Tools: DataStage 7.5V, Talend.

Cloud Computing: Azure, AWS.

PROFESSIONAL EXPERIENCE:

Confidential

Responsibilities:

  • Implemented Spring Boot microservices to process messages into the Confidential cluster.
  • Worked as tech lead to gather business requirements and guided the team in a timely fashion.
  • Worked closely with the Confidential admin team to set up the Confidential cluster on the Dev, QA, and Production environments.
  • Worked on multiple Confidential source and sink connectors (Snowflake sink, Salesforce source, Oracle source, Postgres sink, S3 sink).
  • Worked directly with the Confluent support team on all issues faced.
  • Built multiple utilities to speed up the project.
  • Deployed applications on OpenShift.
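As an illustration of the connector work above: a sink connector is registered with the Kafka Connect REST API as a JSON payload. The sketch below builds one for the Confluent S3 sink connector; the connector name, topic, bucket, and region are hypothetical placeholders, and only a representative subset of properties is shown.

```python
import json

def s3_sink_config(name, topics, bucket, flush_size=1000):
    """Build a Kafka Connect S3 sink connector payload (REST API shape).

    Property names follow the Confluent S3 sink connector; the name,
    topics, bucket, and region values here are placeholders.
    """
    return {
        "name": name,
        "config": {
            "connector.class": "io.confluent.connect.s3.S3SinkConnector",
            "tasks.max": "1",
            "topics": ",".join(topics),
            "s3.bucket.name": bucket,
            "s3.region": "us-east-1",
            "storage.class": "io.confluent.connect.s3.storage.S3Storage",
            "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
            # Number of records buffered before a file is written to S3.
            "flush.size": str(flush_size),
        },
    }

payload = s3_sink_config("orders-s3-sink", ["orders"], "example-bucket")
print(json.dumps(payload, indent=2))
```

In practice this payload would be POSTed to the Connect workers' `/connectors` endpoint, once per environment (Dev, QA, Production).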

Confidential

Responsibilities:

  • Designed a reusable framework on a cloud-based platform (AWS) using PySpark that serves as a common platform for both modules.
  • Designed and built the data layers through to consumption.
  • Designed and built a robust Audit Balance Control (ABC) framework.
  • Improved the user experience: modifications could be entered in a CSV file and applied through the reusable framework without any knowledge of Hadoop.
  • Improved processing time by parallelizing job executions through controlled data pipelines.
  • Provided data for reporting built using Spotfire.
  • Worked with key stakeholders to develop and articulate proposed solutions.
  • Defined the code versioning and repository structure.
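The core of an Audit Balance Control check is reconciling record counts between source and target per entity. This is a minimal pure-Python sketch of that idea (the actual framework ran on PySpark); the table names and counts are illustrative only.

```python
def abc_check(source_counts, target_counts):
    """Audit Balance Control: flag tables whose target row count
    does not reconcile with the source row count."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if src != tgt:
            # Record both sides plus the delta for the audit log.
            mismatches[table] = {"source": src, "target": tgt, "diff": src - tgt}
    return mismatches

issues = abc_check({"orders": 100, "customers": 50},
                   {"orders": 100, "customers": 48})
print(issues)  # customers is short by 2 rows; orders reconciles
```

On Spark the per-table counts would come from `df.count()` on the source and target layers, with the mismatch report persisted to an audit table.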

Confidential

Responsibilities:

  • Designed and built the data layers from ingestion through to consumption.
  • Built the solution on the Azure cloud platform.
  • Integrated Talend with big data components such as Sqoop and Spark.
  • Provided data for an analytics engine built on R/Python.
  • Served as Scrum Master for the Data Lake team; tracked work and delivery on the VSTS Agile board.
  • Ingested 56 source systems across different databases (SAP, SQL Server, Oracle, flat files).
  • Worked with key stakeholders to develop and articulate proposed solutions.
  • Defined the code versioning and repository structure.
  • Defined the pipelines for continuous integration and deployment using Azure DevOps.
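Ingesting dozens of source systems like this is typically metadata-driven: one ingestion command is generated per registered source. The sketch below builds standard Sqoop import commands from a small source registry; the JDBC URLs, table names, and target paths are hypothetical.

```python
def sqoop_import_cmd(jdbc_url, table, target_dir, mappers=4):
    """Build a Sqoop import command for one source table.

    The flags are standard Sqoop import options; connection
    details and paths are placeholders.
    """
    return (f"sqoop import --connect {jdbc_url} "
            f"--table {table} --target-dir {target_dir} "
            f"--num-mappers {mappers}")

# A tiny stand-in for the source-system registry (56 entries in practice).
sources = [
    ("jdbc:oracle:thin:@//host:1521/erp", "GL_ACCOUNTS"),
    ("jdbc:sqlserver://host:1433;databaseName=crm", "CUSTOMERS"),
]
cmds = [sqoop_import_cmd(url, tbl, f"/data/raw/{tbl.lower()}")
        for url, tbl in sources]
for c in cmds:
    print(c)
```

A scheduler (TAC or Oozie, per the skills list) would then run these jobs, so adding a 57th source means adding one registry row rather than new code.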

Confidential

Responsibilities:

  • Imported/exported data between HDFS and RDBMS using Sqoop.
  • Responsible for analyzing and cleansing raw data by performing Hive queries and running Pig scripts.
  • Worked on data integration and data standardization.
  • Used Oozie to create workflows and sub-workflows.
  • Used Autosys to schedule jobs.
  • Worked on the Cloudera Hadoop distribution.
  • The project followed Agile methodology; used the VersionOne tool to capture artifacts.
  • Provided design recommendations and ideas to stakeholders that improved review processes and resolved technical problems.
  • Managed and reviewed Hadoop log files.
  • Tested raw data and executed performance scripts.
  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW.
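A parse-and-aggregate MapReduce job of the kind described above can be sketched as plain Python map and reduce functions, Hadoop Streaming style. The pipe-delimited record layout and the sample records are invented for illustration; the shuffle step is simulated in-process.

```python
from collections import defaultdict

def mapper(line):
    """Parse one raw pipe-delimited record into (partition_key, value) pairs.

    Assumed layout: date|record_id|amount (illustrative only).
    """
    fields = line.rstrip("\n").split("|")
    date_key, amount = fields[0], float(fields[2])
    return [(date_key, amount)]

def reducer(key, values):
    """Aggregate values per partition key for the staging table."""
    return key, sum(values)

raw = ["2024-01-01|A100|25.50",
       "2024-01-01|A101|10.00",
       "2024-01-02|A102|5.25"]

# Map phase.
pairs = [kv for line in raw for kv in mapper(line)]
# Shuffle: group values by key (the framework does this in a real job).
groups = defaultdict(list)
for k, v in pairs:
    groups[k].append(v)
# Reduce phase.
result = dict(reducer(k, vs) for k, vs in groups.items())
print(result)  # {'2024-01-01': 35.5, '2024-01-02': 5.25}
```

In the actual job the reducer output would land in date-partitioned staging tables feeding the EDW.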

Confidential

Responsibilities:

  • Involved in coding, fixing bugs, and gathering information from onsite.
  • Used Ajax, the .NET Report Viewer, SSIS, and the CWL framework.
  • Involved in resolving bugs and participated actively in change requests.
  • Wrote unit test cases for the modules added to the application.
  • Handled projects such as the Enterprise Credit Risk Information System and provided reports for credit review deck users.
  • Involved in performance improvement and fine-tuning of SQL queries.

Environment: ASP.NET 2.0/4.0, AJAX, VB.NET, SQL Server 2005, JavaScript, SSIS, C#.NET, .NET Report Viewer, TFS 2010.

Confidential

Responsibilities:

  • Involved in coding, fixing bugs, and gathering information from onsite and the client.
  • Worked on ASP.NET and Oracle 9i.
  • Used Ajax and web services.
  • Used a Windows Service for data population.
  • Involved in resolving bugs and participated actively in change requests.
  • Wrote unit test cases for the modules added to the application.
  • Mentored new team members and freshers.

Environment: ASP.NET 2.0, AJAX, VB.NET, Oracle 9i, HTML, JavaScript, VSS.

Confidential

Responsibilities:

  • Involved in a fair amount of coding in C#.NET for the algorithm implementations.
  • Worked on ASP.NET and MS SQL Server 2005 to build the different business scenarios for the speaker identification/verification system.
  • Wrote triggers, fixed bugs in code, and held weekly status calls with the client.
  • Involved in designing documents for the call flow and the various protocols used.
  • Prepared unit test cases.
  • Involved in migrating the existing database to a unified database.
  • Used a Windows Service for running the SIS engine.

Environment: ASP.NET 2.0, C#.NET, MS SQL 2005, HTML, DHTML, JavaScript, VSS.

Confidential

Responsibilities:

  • Implemented the complete algorithm in Java.
  • Prepared documents (URD, DDD, unit testing, algorithm pseudocode) for the implementation.
  • Interacted with subject matter experts and called meetings whenever necessary.

Environment: Java, MySQL.
