
Developer Resume


Tampa, Florida

EXPERIENCE SUMMARY:

  • 9+ years of IT experience in development, delivery, and support.
  • Core technical strengths in Hadoop, Spark, SAS, and Java Spring Boot.
  • Good exposure to the Insurance, Retail, and Banking domains.
  • Good understanding of the Software Development Lifecycle.

TECHNICAL SKILLS:

  • Hadoop (MapReduce, Sqoop, Hive, Pig, Oozie)
  • PostgreSQL
  • Scala
  • Kibana
  • ElasticSearch
  • LogStash
  • Control-M Scheduler
  • Spark
  • SAS
  • PL/SQL
  • UNIX, Unix shell scripting
  • Autosys, JIL scripting
  • RM scheduler
  • Java
  • Spring Boot

CAREER PROFILE:

Confidential, Tampa, Florida

Developer

Responsibilities:

  • Develop Java Spring Boot/Spark code to copy data from Hadoop Hive tables to Snowflake (see the sketch after this role's technology list).
  • Develop a Java Spring Boot/Spark application to replace Ab Initio functionality.
  • Develop Spark code to source data from multiple RDBMSs (Oracle, Netezza) and load it into Hive and Snowflake databases.
  • Copy Recon data from Kafka to Elasticsearch using Logstash and Spark.
  • Create Kibana dashboards for Citi Recon data.
  • Create a Kibana dashboard to measure the performance of Apache Storm in the Recon process.

Technologies: Hadoop (HDFS, Apache Hive, Apache Sqoop), shell scripting, Spark batch, Scala, Java, Spring Boot.
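
Below is a minimal Scala/Spark sketch of the Hive-to-Snowflake copy described above. The table names, Snowflake account URL, and credentials are placeholders, and the production code was packaged as a Spring Boot application rather than a standalone object.

    import org.apache.spark.sql.{SaveMode, SparkSession}

    object HiveToSnowflakeCopy {
      def main(args: Array[String]): Unit = {
        // Hive-enabled session so existing warehouse tables are visible to Spark
        val spark = SparkSession.builder()
          .appName("hive-to-snowflake-copy")
          .enableHiveSupport()
          .getOrCreate()

        // Source Hive table (placeholder name)
        val source = spark.table("edw.recon_transactions")

        // Snowflake Spark connector options (all values are placeholders)
        val sfOptions = Map(
          "sfURL"       -> "myaccount.snowflakecomputing.com",
          "sfUser"      -> sys.env("SF_USER"),
          "sfPassword"  -> sys.env("SF_PASSWORD"),
          "sfDatabase"  -> "ANALYTICS",
          "sfSchema"    -> "RECON",
          "sfWarehouse" -> "LOAD_WH"
        )

        // Write the DataFrame to Snowflake, replacing the target table each run
        source.write
          .format("net.snowflake.spark.snowflake")
          .options(sfOptions)
          .option("dbtable", "RECON_TRANSACTIONS")
          .mode(SaveMode.Overwrite)
          .save()

        spark.stop()
      }
    }

The RDBMS sources mentioned above can be read in the same session through spark.read.format("jdbc") with the appropriate connection options before being written to Hive or Snowflake.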

Confidential

Developer

Responsibilities:

  • Develop a Java parser for SAS files.
  • Develop and test a custom input RecordReader for SAS files using Scala (skeleton sketch after this role's entry).
  • Integrate and deploy a Spark-based Scala replication tool for SAS files.
  • Replicate data from multiple sources to the EDM platform.
  • Create Hive views.
  • Configure Hive views in Unica for campaign launches.
  • Create Kibana dashboards.

Technologies: Hadoop (HDFS, Apache Hive, Apache Sqoop, Apache Oozie), shell scripting, Spark Streaming, Scala, Datameer, Java.

Special Software: PuTTY, Unix, SQL Developer, Control-M Scheduler, Eclipse.
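
The skeleton below shows the shape of the custom Hadoop input format and RecordReader for SAS files mentioned above, in Scala. Class names are illustrative and the byte-level parsing of the SAS data set layout is elided; only the Hadoop RecordReader contract is shown.

    import org.apache.hadoop.fs.Path
    import org.apache.hadoop.io.{LongWritable, Text}
    import org.apache.hadoop.mapreduce.{InputSplit, JobContext, RecordReader, TaskAttemptContext}
    import org.apache.hadoop.mapreduce.lib.input.{FileInputFormat, FileSplit}

    // Input format that hands each file to the reader below. Splitting is
    // disabled because a SAS data set has to be parsed from its header.
    class SasFileInputFormat extends FileInputFormat[LongWritable, Text] {
      override def isSplitable(context: JobContext, file: Path): Boolean = false

      override def createRecordReader(split: InputSplit,
                                      context: TaskAttemptContext): RecordReader[LongWritable, Text] =
        new SasRecordReader
    }

    // Skeleton reader: emits one Text value per SAS observation.
    class SasRecordReader extends RecordReader[LongWritable, Text] {
      private val key   = new LongWritable(0)
      private val value = new Text()

      override def initialize(split: InputSplit, context: TaskAttemptContext): Unit = {
        val path = split.asInstanceOf[FileSplit].getPath
        // open `path` with the SAS parser here (parsing elided in this sketch)
      }

      // Advance to the next observation; return false at end of file.
      override def nextKeyValue(): Boolean = false

      override def getCurrentKey: LongWritable = key
      override def getCurrentValue: Text       = value
      override def getProgress: Float          = 0.0f
      override def close(): Unit               = ()
    }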

Confidential, Arkansas

Developer - IT

Responsibilities:

  • Load competitor customer detail and product pricing data from different sources (FTP, Teradata, Oracle) into the Hadoop environment.
  • Copy data using Sqoop from various sources such as Oracle, Teradata, and Informix for comparison/reference against Wal-Mart data.
  • Process high volumes of raw data using Apache Pig and move the output to the Hive warehouse location.
  • Compare other retailers' costs with Wal-Mart pricing and load the results into the Green database for analytics (see the illustrative sketch after this role's entry).
  • Generate reports on the processed/compared product details and send them to the relevant departments to suggest price changes.

Languages: Hadoop (HDFS, MapReduce, Apache Pig, Apache Hive, Apache Sqoop), PostgreSQL.

Special Software: pgAdmin, Unix, SQL Assistant, RM Scheduler.
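
The comparison step in this pipeline was built with Pig and Hive; as a rough Scala/Spark illustration of the same logic, the sketch below joins internal and competitor prices and keeps the items where a competitor undercuts Wal-Mart. Table and column names are hypothetical.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.col

    object PriceComparison {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("competitor-price-comparison")
          .enableHiveSupport()
          .getOrCreate()

        // Hypothetical warehouse tables: internal item prices and the
        // competitor prices landed via the Sqoop/FTP feeds.
        val internal   = spark.table("pricing.item_prices")
        val competitor = spark.table("pricing.competitor_prices")

        // Keep items where a competitor's price undercuts ours; these become
        // candidates for a price-change suggestion report.
        val undercut = internal.alias("i")
          .join(competitor.alias("c"), col("i.item_id") === col("c.item_id"))
          .where(col("c.price") < col("i.price"))
          .select(
            col("i.item_id"),
            col("i.price").as("our_price"),
            col("c.price").as("competitor_price"),
            col("c.retailer")
          )

        // Persist the result for downstream analytics and reporting.
        undercut.write.mode("overwrite").saveAsTable("pricing.price_change_candidates")

        spark.stop()
      }
    }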

Confidential, Hartford, Connecticut

Developer

Responsibilities:

  • Developing SQL code and SAS macros per the SRD document.
  • Creating reports (Excel, pivot, or PDF files, etc.) and SAS cubes per the specifications.
  • Performing unit testing and system integration testing on the developed reports.
  • Scheduling validated reports in the UNIX environment using shell and JIL scripting.
  • Validating report data against the requirements.
  • Automating the report delivery process (RAFT, FTP, web, etc.).
  • Handling ad-hoc reporting in addition to regular development activities.

Languages: SAS, Unix, Unix Shell Scripting, SQL, PL/SQL.

Special Software: SAS EG, SAS OLAP Cubes, Autosys, JIL scripting.
