Senior Hadoop Developer Resume
MO
SUMMARY:
- 13 years of professional IT experience, including 5 years in Big Data/Hadoop architecture, design, development, and ecosystem analytics across the Retail, Healthcare, Banking, and Media domains.
- Experience with major Hadoop ecosystem components such as Spark, MapReduce, HDFS, Hive, Pig, Sqoop, Oozie, Flume, Kafka, Storm, HCatalog, Apache Drill, and Solr.
- Experience with NoSQL databases such as HBase, Cassandra, and MongoDB.
- Experience with Hadoop distributions including Cloudera, Hortonworks, and MapR, as well as AWS EC2 cloud deployments.
- Experience in Python, Scala, and the R analytics language.
- 8 years of experience in Java/J2EE (Servlets, JSP, EJB 2.0, Hibernate 3.0, Struts 1.2, Spring 2.5), RDBMS (Oracle, SQL Server), and application servers (WebSphere 7.0 and WebLogic 8.1).
- Good working knowledge of distributed data processing systems.
- Expertise in the Hadoop Lambda Architecture.
- Capable of designing and architecting Hadoop applications and recommending the right solutions and technologies for the application.
- Proficient in all phases of the SDLC (analysis, design, development, testing, and deployment), gathering user requirements and converting them into software requirement specifications.
- Work closely with business clients.
- Served as liaison between the customer and the offshore and onshore teams.
- Excellent analytical, programming, and logical skills.
- Capable of handling multiple projects and teams at the same time.
- Good experience as a Tech/Project Lead.
TECHNICAL SKILLS:
Big Data Stack: HDFS, MapR-FS, Apache Spark (Core, Streaming & SQL), Kafka, Storm, MapReduce, YARN, Pig, Hive, ZooKeeper, Sqoop, Flume, Oozie, Cloudera, Hortonworks, MapR, AWS, Apache Drill, Apache NiFi.
Databases: Oracle, MS SQL Server, and MySQL.
NoSQL Databases: HBase, Cassandra, and MongoDB.
Languages: Java, Scala, Python, JavaScript, Ajax, XML, and R.
Server-side Technologies: Servlets, JSP, and EJB.
Frameworks: Struts, Hibernate, Spring.
Web and Application Servers: Tomcat, WebSphere, and WebLogic.
GUI Tools: Applets, Swing, and SoapUI.
Operating Systems: Red Hat Linux and Windows.
Tools/Components: Git, Ant, Maven, Solr, CVS, SVN, Eclipse, RAD, Remedy, MS Clarity, SQL Developer, Toad, JIRA, shell scripting, and Java GoF/J2EE design patterns.
Domains: Banking & Financials, Retail, Healthcare, Telecom, and Media & Communications.
PROFESSIONAL EXPERIENCE:
Confidential, MO
Senior Hadoop Developer
Devices: STB, DVR, VOD, and IPTV
Responsibilities:
- Analyzed the existing system processes for TWC, BHN, and Charter.
- Prepared design documents for the ACE process.
- Developed the ACE and zombification processing using Scala and Spark.
- Developed a data quality framework to identify fallout buckets.
- Implemented the 7-day reprocessing logic.
- Led quality assurance reviews, code inspections, and walkthroughs of the offshore developers' code.
- Acted as the technical interface between the development team and external groups.
- Prepared validation scripts to check source data against the ACE enrichment output.
- Developed common DataFrame utilities to save data in ORC format (see the sketch after this list).
- Loaded and processed station and program data from Tribune Media into ACE.
- Created and configured jobs in the Tidal scheduler.
- Worked on a 70-node physical Hadoop 2.x cluster.
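A minimal sketch of what such a common DataFrame-to-ORC utility could look like, assuming the Spark 2.x DataFrame API; the object and method names, paths, and partition column below are illustrative assumptions rather than the actual project code.

```scala
import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}

// Hypothetical utility for persisting enriched DataFrames as ORC.
object OrcWriter {
  // Writes the DataFrame to the target path in ORC format, optionally
  // partitioned, overwriting output from any previous run.
  def saveAsOrc(df: DataFrame, path: String, partitionCols: Seq[String] = Nil): Unit = {
    val writer = df.write.mode(SaveMode.Overwrite).format("orc")
    val partitioned = if (partitionCols.nonEmpty) writer.partitionBy(partitionCols: _*) else writer
    partitioned.save(path)
  }
}

// Illustrative usage: re-save an enriched dataset partitioned by an assumed event_date column.
object OrcWriterExample extends App {
  val spark = SparkSession.builder().appName("ace-orc-example").getOrCreate()
  val enriched = spark.read.format("orc").load("/data/ace/enriched") // illustrative input path
  OrcWriter.saveAsOrc(enriched, "/data/ace/output", Seq("event_date"))
  spark.stop()
}
```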
Skills: Spark, Scala, Sqoop, Tidal Scheduler, Hive, Pig, Unix shell scripting, Netezza, Teradata, HDP, Oracle.
Confidential, CA
Big Data Architect
Responsibilities:
- Analyzed the existing system processes.
- Prepared design documents for the above-specified models.
- Implemented data preparation, scoring, and trend analysis.
- Developed a common export framework to transfer data to different target systems (ET/Exact Target, PAS, Epiphany, Certona, and LiveRamp).
- Built an in-house comparator tool using MapReduce to validate the Data Science and Engineering teams' output data.
- Led quality assurance reviews, code inspections, and walkthroughs of the developers' code.
- Acted as the technical interface between the development team and external groups.
- Provided training for team members and cross-team members.
- Prepared validation scripts to check source and target data after ingestion.
- Implemented scoring logic using R and Hive scripts.
- Created and configured coordinators, workflows, and bundles in Oozie.
- Deployed JAR files to EC2 instances after development.
- Worked on a 62-node physical Hadoop 1.x cluster and a 31-node Hadoop 2.x YARN cluster.
- Wrote Spark transformations and actions in Scala and Python on Hadoop 2 (see the sketch after this list).
- Worked on a 10-node AWS cluster for the Dev & QA environments.
- Involved in AWS VPC security configuration.
- Involved in setting up IAM (Identity and Access Management) roles.
- Involved in network setup of the physical cluster with the admin team.
- Used the Erwin tool for data modeling.
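A minimal sketch of a Spark transformation/action pipeline in Scala of the kind described above; the input path, tab-delimited record layout, and score threshold are illustrative assumptions, not the project's actual logic.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ScoreAggregation {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("score-aggregation"))

    val scored = sc.textFile("/data/scoring/output")   // illustrative input location
      .map(_.split('\t'))                              // transformation: parse tab-delimited fields
      .filter(_.length >= 2)                           // transformation: drop malformed rows
      .map(f => (f(0), f(1).toDouble))                 // transformation: (customerId, score)
      .reduceByKey(_ + _)                              // transformation: aggregate score per customer

    val highValue = scored.filter { case (_, total) => total > 0.75 } // assumed threshold
    println(s"High-value customers: ${highValue.count()}")            // action: triggers the job
    highValue.saveAsTextFile("/data/scoring/high_value")              // action: persists the result
    sc.stop()
  }
}
```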
Skills: MapReduce, Sqoop, Oozie, Hive, Pig, R, Scala, Python, Spark, shell scripting, Hortonworks, Talend integration tool, Oracle (Online/Retail), MDS (marketing data source), Omniture (clickstream).
Confidential
Senior Hadoop Developer.
Responsibilities:
- Created an Oozie coordinator to run the MapReduce job every day at 10 PM EST.
- Created a Sqoop action to export data from MapR-FS to a SQL Server temp table and then trigger a stored procedure that processes the data from the temp table and loads it into the main tables.
- Created an email action to notify the business once the job succeeded.
- Created a touchz file to confirm that data was exported successfully to the target system.
- Benchmarked real-time processing using Kafka and Storm.
- Benchmarked Apache Spark with Scala.
- Benchmarked Apache Drill for BI access to data in HBase.
- Worked on a 20-node physical Hadoop 1.x cluster.
- Created collections in Solr so the UI team can access data from the indexed collections.
- Created an HBase-to-Solr bridge through the Java API to sync HBase data to a Solr collection (see the sketch after this list).
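A minimal sketch of such an HBase-to-Solr sync, written in Scala against the HBase 1.x client and SolrJ Java APIs; the table name, column family/qualifier, and Solr collection URL are illustrative assumptions, not the project's actual values.

```scala
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Scan}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.solr.client.solrj.impl.HttpSolrClient
import org.apache.solr.common.SolrInputDocument
import scala.collection.JavaConverters._

object HBaseSolrSync {
  def main(args: Array[String]): Unit = {
    val conn  = ConnectionFactory.createConnection(HBaseConfiguration.create())
    val table = conn.getTable(TableName.valueOf("events"))                  // assumed table name
    val solr  = new HttpSolrClient.Builder("http://localhost:8983/solr/events").build()

    val scanner = table.getScanner(new Scan())
    try {
      for (result <- scanner.asScala) {
        val doc = new SolrInputDocument()
        doc.addField("id", Bytes.toString(result.getRow))                   // row key as Solr id
        doc.addField("payload", Bytes.toString(
          result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("payload")))) // assumed cf:payload column
        solr.add(doc)                                                       // one HBase row -> one Solr document
      }
      solr.commit()                                                         // make synced documents searchable
    } finally {
      scanner.close(); table.close(); conn.close(); solr.close()
    }
  }
}
```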
Skills: Hadoop, MapR-FS, MapReduce, HBase, Solr with LucidWorks, Sqoop, Oozie, MapR (M5), BusinessObjects (BO), TIBCO JMS, TIBCO Foresight, CEP (Complex Event Processing), TIBCO EMS Server, TIBCO BPM, MS SQL Server 2012.
Confidential, Boston
Tech Lead
Responsibilities:
- Created DAOs (Data Access Object design pattern) and data models in Cassandra 1.2.
- Wrote a common Java Thrift client to connect to Cassandra and execute the business use cases (see the sketch after this list).
- Worked on an 8-node AWS cluster for the Dev environment.
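A minimal sketch of the Thrift connection and DAO wiring against Cassandra 1.2, shown here in Scala for consistency with the other examples; the host, port, keyspace, and the DAO trait are illustrative assumptions, not the production client.

```scala
import org.apache.cassandra.thrift.Cassandra
import org.apache.thrift.protocol.TBinaryProtocol
import org.apache.thrift.transport.{TFramedTransport, TSocket}

// Hypothetical DAO contract that business code would program against.
trait CustomerDao {
  def findById(id: String): Option[String]
}

// Shared Thrift connection helper: open a framed transport and bind to a keyspace.
class ThriftCassandraClient(host: String, port: Int, keyspace: String) {
  private val transport = new TFramedTransport(new TSocket(host, port))
  private val client    = new Cassandra.Client(new TBinaryProtocol(transport))

  def open(): Cassandra.Client = {
    transport.open()
    client.set_keyspace(keyspace)   // subsequent calls run against this keyspace
    client
  }

  def close(): Unit = transport.close()
}
```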
Skills: Hadoop 1.x, HDFS, MapReduce, Pig, Hive, Cassandra, Oracle, Cloudera, Sqoop, Cognos, Data Pipeline, and AWS Cloud.
Confidential
Responsibilities:
- Created a Unix script to copy feed files from the local filesystem to HDFS through the Oozie scheduler.
- Developed a UI for data migration from Oracle 10g to MongoDB through the Java API. Used the Spring API (Mongo, MongoOperations, MongoDbFactory, and MongoTemplate beans) and the MongoDB Java driver to export data from HDFS to MongoDB, applying the DAO (Data Access Object) design pattern (see the sketch after this list).
- Interacted with the data warehouse team to cover the business use cases: the transformations implemented in the ETL layer were re-applied using Pig and Hive for data analysis.
- Created Pig and Hive scripts implementing the business logic.
- Created and configured coordinators, workflows, and bundles in Oozie.
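A minimal sketch of an HDFS-to-MongoDB export through Spring's MongoTemplate, shown in Scala; the Mongo host, database, collection, HDFS path, and two-field record layout are illustrative assumptions rather than the actual migration code.

```scala
import java.io.{BufferedReader, InputStreamReader}
import com.mongodb.MongoClient
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.bson.Document
import org.springframework.data.mongodb.core.{MongoTemplate, SimpleMongoDbFactory}

object HdfsToMongoExport {
  def main(args: Array[String]): Unit = {
    // MongoTemplate wired through a MongoDbFactory, as in the Spring API noted above.
    val mongoTemplate = new MongoTemplate(
      new SimpleMongoDbFactory(new MongoClient("localhost", 27017), "migrationdb"))

    val fs     = FileSystem.get(new Configuration())  // picks up core-site.xml / hdfs-site.xml
    val reader = new BufferedReader(new InputStreamReader(
      fs.open(new Path("/data/export/customers.csv"))))           // illustrative HDFS path
    try {
      Iterator.continually(reader.readLine()).takeWhile(_ != null).foreach { line =>
        val Array(id, name) = line.split(",", 2)                  // assumed "id,name" layout
        val doc = new Document("_id", id).append("name", name)
        mongoTemplate.save(doc, "customers")                      // upsert one document per input line
      }
    } finally {
      reader.close()
    }
  }
}
```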
Confidential
Senior System Analyst
Responsibilities:
- Worked on the existing Stockholm product to fix change requests (CRs).
- Participated in design proposal reviews; worked on the design proposal (DP) for London and fixed change requests (CRs).
- Prepared impact analyses and unit test plans (UTPs) for CRs.
- Performed installation and configuration.
Skills: Java 2.0, JSP 1.2, MS SQL Server 2000/2005, JRun 4, NFC Framework, Hibernate 3.0, Tomcat 6.0, IIS 6.0, MS Clarify, MS VSS 6.0.
Confidential
Software Engineer
Responsibilities:
- Coding and implementation.
- Wrote the unit test plan.
- Used session beans as business objects and deployed the JAR file.
Skills: JDK 1.4.2, JSP, Oracle 9i, WebLogic 8.1, EJB 2.0, Harvest.
Confidential
Developer
Responsibilities:
- Coding and implementation.
- Used EJBs (stateless session beans) as business objects.
- Prepared Unit Test cases.
Skills: JSP, EJB, JavaScript, Oracle 9i, Struts 1.1, WebLogic 7.0.
Confidential
Developer
Responsibilities:
- Coding and implementation.
- Prepared JUnit test cases.
Skills: JSP, JavaScript, Oracle 8, Apache Tomcat 4.1.
Confidential
Programmer
Responsibilities:
- Coding and implementation with unit test cases.
Skills: JDK 1.3, Servlets, JSP, Oracle 7, Swing.