
Technical Project Manager Resume


US

SUMMARY

  • 10 years of experience in healthcare / long-term care billing systems on Java/J2EE and Mainframe applications, including 4 years designing and building high-performance, scalable Hadoop systems using the Big Data ecosystem on Windows and Linux environments with Java.
  • 4+ years of strong end-to-end Hadoop development experience with varying levels of expertise across different Big Data Hadoop projects. Strong hands-on experience with Big Data technologies: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Cassandra, Cloudera Manager, Spark, Scala, Pig, Sqoop, Oozie, ZooKeeper, Teradata, PL/SQL, MySQL, Windows, Hortonworks, HBase.
  • Extensive experience in designing and developing ETL using BTEQ, DataStage, TPT, SQL, MLoad, FastLoad, FastExport, and stored procedures.
  • Designed and developed a reusable, generic "Application Metric Collection and Query API" using REST, CDAP, and Hadoop frameworks. Strong programming skills in designing and implementing multi-tier applications using Java, J2EE, JDBC, JSP, JSTL, HTML, JSF, Struts, JavaScript, Servlets, JavaBeans, CSS, EJB, XSLT, and JAXB. Experienced in developing web services using SOAP and WSDL, and in developing DTDs and XSD schemas for XML (parsing, processing, and design).
  • Currently leading a team of 8 people in an onsite-offshore delivery model.
  • Well versed in the software development life cycle, software development methodologies, and practices.

TECHNICAL SKILLS

Operating Systems: MVS (Z/OS), HP-UX, Sun Solaris, AIX, Linux, Windows (Vista, XP, 2000, NT 4.0, 95), DOS and Apple Mac OS X 10.x.

Big Data Technologies: Apache Hadoop, Cassandra, Cloudera, CDAP, Hive, Pig, Kafka, Sqoop, Oozie, ZooKeeper, YARN, HBase, HDFS, NoSQL, Pivotal HAWQ, MapR, Storm, Spark

Metadata/Tools/API: Tableau, ElasticSearch

Java/J2EE Technologies: Servlets, JSP (EL, JSTL, Custom Tags), JSF, Apache Struts, JUnit, Hibernate 3.x, Log4J, Java Beans, EJB 2.0/3.0, JDBC, RMI, JMS, JNDI.

Web Technologies: XML, XSL, XSLT, SAX, DOM, CSS, JavaScript, HTML, AJAX, GUI, Web services (SOAP, WSDL, Axis), Apache POI, iText, JBoss SEAM, BizTalk

Mainframe Technologies: JCL, COBOL, RPG, SAS, SQL, MQSeries, VSAM, CICS

Mainframe Tools: File Manager, MAX, SCLM, SORT, CONNECT DIRECT, PDSMAN, EZYEDIT, INTERTEST (online), Rational Suite PACBAS, ENDEVOR

Database: Oracle, MongoDB, Sybase, Hyperion, IBM DB2, IMS, MySQL, Teradata

Programming/Scripting: Python, SAS, Java, C, C++, Shell scripting.

Methodologies: UML, OOAD, RUP, Waterfall model, Agile.

PROFESSIONAL EXPERIENCE

Confidential, US

Environment: Hadoop, MapReduce, MapR, HDFS, Hive, Java, SQL, Cloudera Manager, CDAP, Scala, Pig, Sqoop, Oozie, ZooKeeper, Kafka, Teradata, PL/SQL, MySQL, Windows, Hortonworks, HBase.

Technical Project Manager

Responsibilities

  • Worked with data ingestion techniques to move data from various sources into HDFS.
  • Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop.
  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW (see the sketch after this list).
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Installed Oozie workflow engine to run multiple Hive and Pig jobs.
  • Analyzed data in different formats using Scala.
  • Wrote MapReduce programs in Java.
  • Worked extensively with partitioned and bucketed tables in Hive, and designed both managed and external tables.
  • Worked on optimization of Hive queries.
  • Created Tableau folders, global sources, global targets, global ODBC connections, and global FTP connections; scheduled workflows; performed data profiling; and managed metadata.
  • Developed Pig UDFs for functionality not available out of the box. Collected, analyzed, and monitored structured and unstructured data using ELK (Elasticsearch).
  • Created and worked with Sqoop jobs with full-refresh and incremental loads to populate Hive external tables.
  • Used Pig for data transformations.
  • Responsible for managing and reviewing Hadoop log files.
  • Developed UDFs for MapReduce, Hive, and Pig.
  • Designed applications that stored data to HDFS through Kafka for improved performance.
  • Integrated Spark with Kafka to feed data from Kafka events into Spark as input.
  • Designed RDDs with Spark Streaming and Spark SQL.
  • Presently implementing Kafka and Storm for streaming data from one of the sources.
  • Developed a data pipeline using Kafka and Storm to store data into HDFS.
  • Worked on HBase and its integration with Storm.
  • Designed and created Oozie workflows to schedule and manage Hadoop, Hive, Pig, and Sqoop jobs.
  • Worked with RDBMS imports to and exports from HDFS.
  • Involved in requirement analysis.
  • Provided knowledge transfer (KT) to other team members.
  • Prepared project documentation.
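A minimal, illustrative sketch of the MapReduce parsing step referenced in the bullets above: a map-only job that cleanses raw delimited records before they are loaded into staging tables. The class name, field positions, and pipe-delimited layout are assumptions for illustration only; the actual record layouts and EDW schema are not part of this resume.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Hypothetical map-only job: parse pipe-delimited raw records and keep only
    // the cleaned fields needed by the staging layer.
    public class RawRecordParser extends Mapper<LongWritable, Text, NullWritable, Text> {

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|");
            // Skip malformed rows; the three columns kept here are illustrative.
            if (fields.length >= 3) {
                String refined = fields[0].trim() + "\t" + fields[1].trim() + "\t" + fields[2].trim();
                context.write(NullWritable.get(), new Text(refined));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "raw-record-parse");
            job.setJarByClass(RawRecordParser.class);
            job.setMapperClass(RawRecordParser.class);
            job.setNumReduceTasks(0);                 // map-only parse/cleanse step
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The cleansed output can then be loaded into partitioned Hive staging tables as described in the surrounding bullets.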

Confidential, US

Environment: Java/J2EE, WSS, JSP, Servlets, Struts, SOAP, COBOL, Python, RPG, JCL, VSAM, DB2, CICS.

Technical Project Manager

Responsibilities

  • Involved in requirement analysis, functional specifications, and overall component design.
  • Developed modules using Core Java, JSP, HTML, and Struts MVC.
  • Designed and developed the UI using HTML5, CSS, JSP, JavaScript, and Ajax.
  • Used the Hibernate framework for backend persistence.
  • Implemented and consumed REST web services.
  • Implemented design patterns such as Factory, Singleton, and Decorator (see the sketch after this list).
  • Created SQL queries and Stored Procedures for CRUD (Create, Read, Update and Delete) operations on database.
  • Implemented multithreading and collections in Java code.
  • Implemented utility programs and coded Java classes.
  • Performed JUnit testing by writing JUnit test cases to verify functionality.
  • Worked in Agile environment for quick and phased delivery of code.
  • Participated regularly in Agile Scrum meetings.
  • Involved in production deployment builds.
  • Used the Eclipse 3.6 IDE for developing code modules in the development environment.
  • Implemented the logging mechanism using log4j framework.
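A minimal sketch of the Singleton and Factory patterns mentioned in the design-patterns bullet above. The ServiceClientFactory name, the protocol strings, and the client classes are hypothetical stand-ins for illustration, not code from the actual project.

    // Singleton: one shared factory instance. Factory: callers request a client
    // by protocol name instead of constructing concrete classes themselves.
    public final class ServiceClientFactory {

        private static final ServiceClientFactory INSTANCE = new ServiceClientFactory();

        private ServiceClientFactory() {
            // private constructor prevents outside instantiation (Singleton)
        }

        public static ServiceClientFactory getInstance() {
            return INSTANCE;
        }

        // Factory method: return the appropriate client for a given protocol.
        public ServiceClient create(String protocol) {
            switch (protocol.toLowerCase()) {
                case "rest":
                    return new RestServiceClient();
                case "soap":
                    return new SoapServiceClient();
                default:
                    throw new IllegalArgumentException("Unsupported protocol: " + protocol);
            }
        }

        public interface ServiceClient {
            String call(String resource);
        }

        static class RestServiceClient implements ServiceClient {
            public String call(String resource) { return "GET " + resource; }
        }

        static class SoapServiceClient implements ServiceClient {
            public String call(String resource) { return "SOAP request for " + resource; }
        }
    }

    // Usage (hypothetical resource path):
    // ServiceClientFactory.getInstance().create("rest").call("/members/42");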

Confidential

Environment: Java, COBOL, JCL, VSAM, DB2, IMS, CICS.

Sr. Software Engineer

Responsibilities

  • Assisted with and troubleshot the current Mainframe environment.
  • Analyzed designs, and developed and executed custom application features and functions.
  • Developed appropriate standards and controls to ensure that every system project functioned effectively.
  • Designed, coded, tested, debugged, and documented programs, working at the top technical level across all levels of application programming activities.
  • Evaluated, designed, coded, and tested improvements to complex modules as required.
  • Developed system parameters and interfaces for complex components.
  • Designed and developed application code to technical and functional programming standards.
  • Provided primary assistance with installing application releases into production under given direction.
  • Coordinated and participated in structured peer reviews and walkthroughs.
  • Planned and implemented all required process steps as defined in the methodologies.
  • Developed operational documentation for the application.
