Big Data/Hadoop POC Project Resume

SUMMARY

  • Over 10 years of hands-on experience in data warehousing with the ETL tools DataStage and Informatica on Oracle 11g/10g and DB2. Involved in every phase of the system development lifecycle, including feasibility studies, gathering user and business requirements, analysis, design, development, testing, and management for medium and large data warehouse projects in domains such as Sales, Insurance, Retail, and Manufacturing.
  • Application operations support for corporate data warehouse projects.
  • Excellent experience with Hadoop cluster setup.
  • Big Data Hadoop Administrator/Developer, certified at HTC Global Service Ltd.
  • ITIL Foundation V3, certified at ATOS (Siemens).

TECHNICAL SKILLS

Big Data Tools: Storm (stream processing), Logstash (shipper, indexer), Redis/RabbitMQ server, Elasticsearch, Cassandra, HBase, Hive, Pig, Kibana (visualization)

ETL Tools: DataStage 7.5.2 & 7.1, Informatica 6.1 & 8.6

Database: DB2, Oracle 10g/9.2, PL/SQL

GUI & Tools: Visual Basic, Crystal Reports, TOAD, SQL Developer

Operating System: Windows XP/NT/2000/98, UNIX, Linux

PROFESSIONAL EXPERIENCE

Confidential

BIG DATA/HADOOP POC PROJECT

Responsibilities:

  • Designed the system architecture for multiple internal ATOS log systems.
  • Gathered complete business requirements for all log sources (syslog, Apache logs, Log4j, etc.).
  • Used SQL as the query interface to both Hadoop and the data warehouse.
  • Used Pig as the scripting language for data flows.
  • Managed and coordinated the team to develop per the business requirements.
  • Used HBase, a NoSQL database that runs on top of Hadoop, as a distributed and scalable big data store.
  • Prepared documentation per the requirements process.
  • Executed and advised on the optimal solution implementation.
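As an illustration of the parsing step such a log-analysis pipeline performs before indexing, the following is a minimal Python sketch for Apache access logs in Common Log Format; the regex and field names are assumptions for the example, not the POC's actual code:

```python
import re

# Common Log Format: host ident authuser [timestamp] "request" status bytes
# The pattern and field names below are illustrative assumptions.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_line(line):
    """Parse one Common Log Format line into a dict, or return None if malformed."""
    match = LOG_PATTERN.match(line)
    if not match:
        return None
    record = match.groupdict()
    record["status"] = int(record["status"])
    record["size"] = 0 if record["size"] == "-" else int(record["size"])
    return record

if __name__ == "__main__":
    sample = '127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326'
    print(parse_line(sample))
```

Records parsed this way can then be shipped to an indexer (e.g. via Logstash into Elasticsearch) for visualization in Kibana.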

Tools used for POC log analysis:

Confidential

Consultant

Responsibilities:

  • Developed mappings and workflows per the customer's new requests.
  • Monitored daily jobs through the OPC/TWS scheduler tool.
  • Handled aborted jobs during data loads into the data warehouse.
  • Monitored monthly jobs to avoid delays in loading data into the data warehouse.
  • Tested the developed mappings and workflows in the development environment.
  • Handled the monthly LBASE task to validate customer DHL data against our table data.
  • Assisted new team members in coming up to speed and in building their skills in Informatica and DB2.
  • Ensured on-time, defect-free delivery and achieved a high level of client satisfaction.

Environment: Informatica 8.6.1, DB2, QMF tool, OPC/TWS Job Scheduler, UNIX and Linux.

Confidential

Associate Consultant

Responsibilities:

  • Prepared daily data-loading status reports for the customer.
  • Used FTP to transfer files between different UNIX boxes.
  • Manually spooled data files from the MQ Series queue and transferred source data files from the SAP system to the UNIX system.
  • Resolved source data issues in CSV and flat files; manually corrected the source data and loaded it into the DWH system.
  • Provided weekly status reports for all subprojects, infrastructure issues, and data file delivery on UNIX.
  • Created the Correction Characteristics report for New/Change requests.
  • Assisted new team members in building their skills in DataStage and Oracle.
  • Prepared documents for new requirements developed with the offshore team.
  • Provided data extracts per user requests for analysis in weekly and monthly reports.

Environment: DataStage 7.5.2, Oracle 10g, UC4 Job Scheduler, and Windows XP.

Confidential

Sr. Software Engineer

Responsibilities:

  • Prepared daily data-loading status reports for the onsite team and the customer.
  • Manually extracted data from the SAP source system to load into the DWH.
  • Maintained a weekly logbook covering all issues.
  • Involved in Trafo business logic and the flow of data loading into the data warehouse.
  • Solved and debugged blocked jobs in UC4 and DataStage, then reported Error Characteristics (analysis results) to the customer.
  • Involved in production support issues.
  • Manually loaded data into the Integration environment during Environment (I/P Switch) changes before backup days.
  • Manually spooled data files from the MQ Series queue and transferred source data files from the SAP system to the UNIX system.
  • On the database side, created and replaced views and rebuilt indexes and partitions.

Environment: DataStage 7.5.2, Oracle 9.2/10g, UC4 Job Scheduler, and Windows XP.

Confidential

Software Engineer

Responsibilities:

  • Monitored the production server for daily data-loading status and blocked jobs; analyzed and debugged blocked jobs in the scheduler and sent analysis results to the customer.
  • Used the DataStage Director and its run-time engine to schedule the solution's runs, test and debug its components, and monitor the resulting executable versions.
  • Resolved source data issues in CSV and flat files; manually corrected the source data and loaded it into the DWH system.
  • Gathered customer requirements, prepared documents to complete each request, and sent Correction Characteristics to the customer.
  • During Environment (I/P Switch) changes, manually loaded data into the Integration (test) environment so that data loading in both environments stayed equal and the switch could transfer to a single environment.
  • On the database side, created and replaced views and rebuilt indexes and partitions.
  • Validated data in both P/I environments; where discrepancies were found, deleted the bad data and reloaded it into the DWH system.
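For illustration, the view-replacement and index-rebuild pattern mentioned above can be sketched as follows. This is a hypothetical example using SQLite so it is self-contained; the original work was on Oracle, where `CREATE OR REPLACE VIEW` and `ALTER INDEX ... REBUILD` are the corresponding statements:

```python
import sqlite3

# Illustrative only: the actual environment was Oracle; SQLite is used here
# so the sketch runs anywhere. Table and view names are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loads (id INTEGER PRIMARY KEY, env TEXT, rows_loaded INTEGER);
    INSERT INTO loads (env, rows_loaded) VALUES ('P', 100), ('I', 100), ('I', 0);
    CREATE INDEX idx_loads_env ON loads (env);
    CREATE VIEW v_good_loads AS SELECT * FROM loads WHERE rows_loaded > 0;
""")

# "Replace" the view: SQLite lacks CREATE OR REPLACE VIEW, so drop and recreate,
# then rebuild the index (Oracle equivalent: ALTER INDEX idx_loads_env REBUILD).
conn.executescript("""
    DROP VIEW v_good_loads;
    CREATE VIEW v_good_loads AS
        SELECT env, COUNT(*) AS n FROM loads WHERE rows_loaded > 0 GROUP BY env;
    REINDEX idx_loads_env;
""")

for row in conn.execute("SELECT env, n FROM v_good_loads ORDER BY env"):
    print(row)
```

Replacing a view this way lets downstream reports pick up new logic without touching the underlying table.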

Environment: DataStage 7.1, Oracle 9i, UC4 Job Scheduler, and Windows NT.

Confidential

Software Engineer

Responsibilities:

  • Identified the significant attributes to extract through a thorough understanding of the client database.
  • Created the repository in Informatica 6.1.
  • Designed mappings in Informatica to load data into the data warehouse from the source system.
  • Extensively involved in extraction from various flat files and an Oracle OLTP system.
  • Created complex mappings/mapplets using expression, aggregator, joiner, rank, filter, and lookup transformations in Informatica PowerMart.
  • Developed and Deployed Informatica ETL components
  • Developed and tested ETL processes and components using Informatica
  • Performed maintenance, troubleshooting and fine tuning of existing ETL processes
  • Designed, Developed and Tested Mappings that migrated data from the legacy sources to the ADS (warehouse).
  • Involved in Unit Testing, Performance tuning & Functional Testing activities.

Environment: Informatica 6.1, Oracle 9i, SQL Navigator, HTML, Windows NT, MS-Excel.

Confidential

Programmer

Responsibilities:

  • Designed logical and physical database models using CASE tools.
  • Wrote programs in Visual Basic 6.0 against a SQL Server database.
  • Generated reports that helped management tighten control over purchases.
  • Developed triggers, stored procedures, and PL/SQL code.
  • Performed system and integration testing, and supported QA testing.
  • Performed requirements analysis and prepared the specification document.
  • Involved in requirements gathering for GUI screens.

Environment: Visual Basic 6.0, Oracle, ADO, SQL Server 6.5, Windows 98.
