Designer Resume
Foster City, CA
SUMMARY
- Big Data Hadoop (Cloudera Certified) professional with more than 9 years of experience in System Analysis, Design, Development, Administration, Maintenance, and Production Support.
- Almost 2 years of hands-on experience with Big Data Hadoop on an enterprise mid-size cluster for one of the largest payment network companies.
- Strong skillset in Hadoop administration and development using MapReduce, Pig, Hive, Sqoop, Fair Scheduler, Flume and Oozie.
- Strong Unix shell scripting skills.
- Hands-on and instrumental in the ETL data platform migration from Ab Initio to Hadoop.
- Strong in Hadoop cluster testing and validation after upgrades or patching.
- Strong exposure to Oracle.
TECHNICAL SKILLS
Big Data: Hadoop (Cloudera 4.x.x and 5.x.x), Hive, Pig, Sqoop, Oozie, Flume, Fair Scheduler, HUE, Nagios, Ganglia
Databases: Oracle, DB2
Development Tools: MapReduce, HQL, Pig Latin, PL/SQL, Java, C, C++
Scripting: Unix Shell Scripting
ETL Tool: Ab Initio
PROFESSIONAL EXPERIENCE
Confidential, Foster City, CA
Designer
Environment: Hadoop 2.0.0-cdh4.2.1 / CDH 5.1.2, Cloudera Manager 4.8 / 5, Hive 0.10 / 0.12, Pig 0.10 / 0.12, Oozie, Flume, RHEL 5.9 / 6.4, Ab Initio, AIX
Responsibilities:
- Working as a Designer, guiding the development team on Hadoop best practices for new deployments and for the migration from Ab Initio to Hadoop.
- Performance tuning of Hive and Pig jobs as required.
- Successfully completed a POC of Sqoop export from HDFS to DB2 to avoid Ab Initio loads for very large data sets.
- Address any Hadoop job failure reported by the application support team and resolve it within SLA.
- Instrumental in data-fix processes on the Hadoop platform; monitor alerts (e.g., memory exceptions, missing data partitions) and cluster health through Cloudera Manager.
- Validate the quality of data loaded into Hadoop from different sources on an ad-hoc basis.
- Analysis and research using Pig, Hive, and MapReduce for process improvements.
- Good exposure to restarting services as required (faced several times with Hive and Flume).
- Working in tandem with core Linux administration team for upgrades and maintenance of the cluster.
- Working with Linux admin team regarding sizing of cluster (master nodes and number of data nodes).
- Reviewing application code/UDFs to make sure they do not break anything in the cluster.
- Single-handedly developing and administering an in-house decision science repository for predictive analysis.
- Preparing weekly management status report.
- Currently involved in the CDH upgrade from CDH 4 to CDH 5 for all Hadoop clusters.
- Worked as a technical lead to implement the first Big Data Hadoop POC for the RS Competency as per customer requirements.
- Trained more than 30 resources in RS, including application SMEs, who to date have been deployed as Hadoop developers, support staff, and testers.
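For context, the Sqoop export POC from HDFS to DB2 mentioned above could be sketched roughly as follows; the host, port, database, table, and HDFS path are hypothetical placeholders, not details from the actual project:

```shell
# Hypothetical sketch of a Sqoop 1 export from HDFS into DB2.
# Connection details, table name, and export directory are placeholders.
sqoop export \
  --connect jdbc:db2://db2host.example.com:50000/SALESDB \
  --username loader \
  --password-file /user/loader/.db2.password \
  --table TXN_SUMMARY \
  --export-dir /data/warehouse/txn_summary \
  --input-fields-terminated-by ',' \
  --num-mappers 8
```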
Confidential
Project Lead
Environment: Unix Shell Scripting, Oracle Apps 11i, Oracle 10g, Toad, Sun Solaris, AIX, SSH, PGP, Reports 6i, Apps Modules: AP, PO, GL, and iExpense
Responsibilities:
- Sole core developer who fixed more than 180 code objects impacted by the infrastructure upgrade.
- Designed a scanner tool to assess the impact of the technical upgrade on custom objects.
- Managing production support at Level 3.
- Preparation of metrics data and a defects tracker for management and client status calls.
- Managing Knowledge Management Repository.
- Managing any infrastructure issue.
- Managing connectivity testing using SSH and PGP.
- Managing a team of 6 developers and 12 technical support resources, both offshore and onshore.
- Requirement gathering, design and code development.
- Co-ordination with DBAs for any technical support activity.
- Worked on preparation of unit test scripts.
Confidential
Developer
Environment: Oracle, C, C++ and UNIX Shell Script
Responsibilities:
- Developed Custom Interface for uploading the data from IPIG Instrument into Oracle database.
- Provided technical assistance to other team members.
- Prepared technical design documents for customizations and developments.
Confidential
Team Member
Environment: Oracle, C, C++ and UNIX Shell Script
Responsibilities:
- Developed a custom interface for controlling turbine speed and data flow.
- Work Documentation.
Confidential
Team Member
Environment: C and Unix
Responsibilities:
- Tested the code of the interface designed for the Apple Sorting Machine.
- Documentation.