Software Engineer Resume

TECHNICAL SKILLS:

Hadoop ecosystem: MapReduce, Hive, Pig, Kafka, YARN, HDFS, Sqoop, Spark, Scala

Languages and Libraries: Java, Python, R, Oracle SQL, PL/SQL, HTML, Pandas, OpenCV, NumPy

Databases: MS SQL Server, GemFireXD, Oracle 11g, MySQL

Tools: SQL Developer, Informatica, Bitbucket, Apache Thrift, Protocol Buffers, LIBSVM, Weka

Machine Learning: Linear Regression, Classification, Clustering

PROFESSIONAL EXPERIENCE:

Software Engineer

Confidential

Responsibilities:

  • Implemented data ingestion (RDBMS to HDFS) on the Confidential platform (a sketch of this step follows the technologies list below).
  • Tested Confidential tools on different Hadoop distributions.
  • Handled storage and built components for transforming the data.
  • Worked with Agile methodologies for design and development.

Technologies used: MapReduce, Sqoop, MySQL, Hortonworks Distribution, Cloudera Distribution.
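
The technologies list shows the ingestion was done with Sqoop; as a hedged Python stand-in, the PySpark sketch below performs the same kind of parallel RDBMS-to-HDFS pull. The host, database, table, credentials, split column, and HDFS path are all placeholders, not details from the actual engagement.

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("rdbms-to-hdfs-ingestion")
             .getOrCreate())

    # Parallel JDBC extract: the table is split on order_id into 8 partitions,
    # much like Sqoop's --split-by and --num-mappers options.
    orders = (spark.read.format("jdbc")
              .option("url", "jdbc:mysql://db-host:3306/sales")  # placeholder
              .option("dbtable", "orders")                       # placeholder
              .option("user", "etl_user")
              .option("password", "***")
              .option("partitionColumn", "order_id")
              .option("lowerBound", "1")
              .option("upperBound", "10000000")
              .option("numPartitions", "8")
              .load())

    # Land the rows on HDFS as Parquet for downstream Hive/MapReduce jobs.
    orders.write.mode("overwrite").parquet("hdfs:///data/raw/sales/orders")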

Confidential

Software Engineer

Responsibilities:

  • Focused on large-volume data transmission and migration from Oracle and Teradata to GemFire XD.
  • Performed data modelling, data loading, and synchronization using the ETL tool Informatica.
  • Applied transformations during loading to ensure clean, successful loads (illustrated in the sketch after the tools list below).
  • Maintained data integrity under concurrent transmissions from multiple source systems.
  • Recognized by the client and awarded the Kaizen certificate for innovation.

Tools: Informatica Power Center, GemFireXD, Oracle 11g, Teradata
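
Informatica mappings are built in a designer rather than written as code, but the transform-during-load pattern described above can be sketched in Python with pandas. The connection strings, table, and column names are invented for illustration, and a generic SQL target stands in for GemFire XD.

    import pandas as pd
    from sqlalchemy import create_engine

    source = create_engine("oracle+oracledb://etl:***@src-host/ORCL")  # placeholder
    target = create_engine("postgresql://etl:***@tgt-host/warehouse")  # stand-in for GemFire XD

    # Read in chunks and clean rows as they pass through, so only valid,
    # normalized data reaches the target: transformation during loading.
    for chunk in pd.read_sql("SELECT * FROM customers", source, chunksize=50_000):
        chunk["cust_id"] = chunk["cust_id"].str.strip()          # normalize keys
        chunk = chunk.dropna(subset=["cust_id", "created_at"])   # drop incomplete rows
        chunk["created_at"] = pd.to_datetime(chunk["created_at"], utc=True)
        chunk.to_sql("customers_stg", target, if_exists="append", index=False)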

Confidential

Software Developer

Responsibilities:

  • Implemented highly efficient data transformation (ETL) from 7 different Oracle and Teradata sources, reducing run time by 80%.
  • Wrote complex SQL queries to eliminate unnecessary data at load time, resolving a bottleneck (see the sketch after the tools list below).
  • Worked in Informatica to optimize session run time and reduce data traffic across the network (transfers of millions of rows reduced to thousands).
  • Ran data transformations concurrently across all 7 source instances.
  • Maintained data integrity through a custom rollback feature implemented alongside parallel Informatica sessions running on different mappings.
  • Handled large data sets and delivered faster solutions across the data transformation lifecycle.
  • Worked well in Agile and fast-changing environments; adept across the SDLC.
  • Result: 19x faster performance and improved customer satisfaction.

Tools: Informatica Power Center, GemFire XD, Oracle 11g, Teradata, Oracle SQL Developer, Teradata Manager.
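
One way to read the "eliminate unnecessary data at load time" bullet: push the filter into the extraction query itself, so excluded rows never leave the source database. A hedged Python sketch, with an invented connection string, table, and columns:

    from datetime import datetime

    import pandas as pd
    from sqlalchemy import create_engine, text

    source = create_engine("oracle+oracledb://etl:***@src-host/ORCL")  # placeholder

    # The WHERE clause runs on the source database, so a multi-million-row
    # table yields only the small delta that actually needs to move.
    query = text("""
        SELECT order_id, cust_id, amount, updated_at
        FROM orders
        WHERE status = 'ACTIVE'
          AND updated_at > :last_load
    """)
    delta = pd.read_sql(query, source, params={"last_load": datetime(2024, 1, 1)})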

Intern, Software Developer

Confidential

Responsibilities:

  • Worked on visually lossless compression of JPEG images for feature extraction using self-organizing maps, supporting the company's presence in the e-learning space.
  • Used Kohonen self-organizing maps to reduce the brightness levels and data stored per pixel (a minimal sketch follows this list).
  • This involved reducing the features of an image below the threshold the human eye can detect, so the compression is visually lossless.
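
A minimal NumPy sketch of the approach, assuming a grayscale image and a 1-D Kohonen map; the node count, learning schedule, and stand-in image are illustrative placeholders, not the original project code. The SOM learns a small codebook of intensity levels, and each pixel is replaced by its nearest entry, cutting the data stored per pixel from 8 bits to 4.

    import numpy as np

    def train_som(pixels, n_nodes=16, epochs=3, lr0=0.3, sigma0=4.0):
        """Train a 1-D Kohonen self-organizing map on intensities in [0, 255]."""
        rng = np.random.default_rng(0)
        weights = rng.uniform(0, 255, n_nodes)       # initial codebook
        idx = np.arange(n_nodes)
        stream = rng.permutation(np.tile(pixels, epochs))
        for t, x in enumerate(stream):
            frac = t / len(stream)
            lr = lr0 * (1 - frac)                    # decaying learning rate
            sigma = max(sigma0 * (1 - frac), 0.5)    # shrinking neighborhood
            bmu = np.argmin(np.abs(weights - x))     # best-matching unit
            h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))
            weights += lr * h * (x - weights)        # pull neighborhood toward x
        return np.sort(weights)

    def quantize(image, codebook):
        """Replace each pixel with its nearest codebook intensity."""
        codes = np.argmin(np.abs(image[..., None] - codebook), axis=-1)
        return codes, codebook[codes]                # 4-bit codes + reconstruction

    img = np.random.randint(0, 256, (64, 64)).astype(float)  # stand-in image
    codebook = train_som(img.ravel())                        # 16 learned levels
    codes, recon = quantize(img, codebook)                   # codes fit in 4 bits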
