Hadoop Technology Lead Resume
Peoria, IL
SUMMARY:
- 10+ years of experience in the full software development life cycle (SDLC), including requirement gathering, problem solving, innovation, analysis, design, development, support, writing technical/system specifications, interface development, and implementation of many successful Mainframe, distributed, and Big Data applications.
- Skilled in Hadoop (Cloudera and MapR) architecture and its components, including HDFS, YARN, Oozie, Hive, Impala, Pig, Sqoop, and the MapReduce programming paradigm.
- Experienced in big data analysis and developing data models using Hive, MapReduce, and SQL, with strong data-architecture skills for designing data-centric solutions.
- Experienced in providing technical and warranty support for applications, resolving production issues, conducting root cause analysis, adhering to service level agreements, and ensuring high service availability.
- Excellent planning, organization, leadership, communication, and interpersonal skills, with an ability to work under strict deadlines.
- Has held multiple technical lead and business analyst roles that draw on senior technical ability while effectively leading global team initiatives driven by business partners.
- Extensive digital marketing experience; highly skilled in distributing data to downstream marketing systems for analytical purposes.
- Analytical and experienced information technology and business analysis professional, dedicated and self-motivated to meet the growing demands of businesses and to modernize legacy applications with new technology solutions and innovations.
TECHNICAL SKILLS:
Big Data Technologies: Hadoop, HDFS, Hive, Impala, MapReduce, Oozie, Pig, Sqoop, HBase, Spark, Python, Shell Scripting
Core Java/Web Technologies: Core Java, JSP, Servlets, XML, JDBC, JSON, Web Services, HTML, CSS, JavaScript, Tomcat Web Server
Database and Search Engines: DB2, Oracle, Solr, Elasticsearch, MongoDB, IMS, MySQL, SQL
Mainframe Technologies: COBOL, JCL, Syncsort, SPUFI, Apptune, File-Aid, QMF
Development Tools: Eclipse, SAS Enterprise Studio, SVN, Putty, SOAP UI, SQL Developer, Rally, Mingle, VSTS
Schedulers: Control M, JobTrac, Oozie, Crontab
Operating Systems: Unix, Linux, Shell Scripting, Mainframe, Windows
Project/Change Management and Support: Onsite coordination, requirement elicitation, effort estimation, project tracking and execution using SDLC, Agile-Scrum, and Waterfall methodologies; application support and change management using ITSM tools like ServiceNow
PROFESSIONAL EXPERIENCE:
Hadoop Technology Lead
Confidential, Peoria, IL
Responsibilities:
- Working on a Cloudera Hadoop Distribution (CDH 5.8) platform, supporting and implementing Big Data solutions using Impala, Hive, Oozie, MapReduce, Shell/Python scripting, and Java technologies.
- Working on data analysis and data modeling, and developing monitoring and data visualization solutions for daily file-processing jobs.
- Supporting and resolving technical issues and addressing process failures in the application on a daily basis.
- Working with the development and support teams to create, analyze, and validate business requirements and translate them into functional stories in Mingle.
- Analyzing and resolving data issues related to data quality, integrity and availability.
- Providing performance analysis of Oozie workflows and file-processing jobs in the development and production environments.
- Providing solution architecture for data migration and support from on-premises infrastructure to the AWS cloud platform.
- Helping data scientists create Tableau dashboards for application monitoring and reporting purposes.
- Providing solutions for data conversion and table updates on the Hadoop platform.
- Creating operational-visibility solutions that report Hadoop application status, create incidents for failures, and ensure service level agreements are met through timely issue resolution (see the sketch after this list).
- Analyzing and creating solution diagrams and documentation for business presentations.
- Providing optimization recommendations and solutions for existing processes and designs.
- Assisting the infrastructure support team with incident resolution and root cause analysis, ensuring high service availability; leading and coordinating with the offshore support team.
- Performing code reviews and testing, and coordinating with the team for quality deliverables.
- Supporting production releases from the dev/test cluster to the production cluster.
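A minimal Python sketch of the operational-visibility check described above, using the impyla client to flag daily file-processing jobs that missed their run. The host, the ops.job_audit table, and its columns are illustrative assumptions, not details from the project.

```python
from impala.dbapi import connect  # impyla client for Impala

def find_late_jobs(host="impala-host", port=21050):
    """Return file-processing jobs whose latest audit entry is older than today."""
    conn = connect(host=host, port=port)
    cur = conn.cursor()
    # ops.job_audit is a hypothetical audit table written by the Oozie workflows
    cur.execute("""
        SELECT job_name, MAX(run_date) AS last_run
        FROM ops.job_audit
        GROUP BY job_name
        HAVING MAX(run_date) < to_date(now())
    """)
    return cur.fetchall()  # one row per late job, ready for incident creation
```

Each row returned here would feed the incident-creation step, so SLA breaches surface before the business notices them.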
Environment: Hadoop, Impala, Cloudera, HDFS, Hive, UNIX, Python, Shell Scripting, Oozie, Java, Mingle
Big Data Developer/Lead
Confidential, Phoenix, AZ
Responsibilities:
- Worked on a MapR Hadoop platform to implement Big Data solutions using Hive, MapReduce, Shell scripting, and Java technologies.
- Worked on data analysis, data modeling, and developing ETL logic using Hive, Shell Scripting, and Java UDFs.
- Worked with a data lake architecture, developing services to ingest and extract data feeds into the application use-case area from RDBMS databases like Oracle.
- Worked with Elasticsearch as an enterprise search engine on top of the Hadoop infrastructure.
- Developed bash shell scripts that invoke Hive HQL scripts and manage the dependencies between them (see the sketch after this list).
- Worked with data scientists to migrate traditional SAS code to Hive HQL, running it on the Hadoop platform with higher efficiency and shorter run times.
- Scheduled batch jobs using Crontab and Oozie workflows.
- Analyzed and created solution diagrams and documentation for business presentations.
- Provided optimization recommendations and solutions for existing processes and designs.
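The bash-plus-HQL orchestration above can be sketched in Python as well; a minimal version, assuming three hypothetical HQL files run in strict dependency order via the hive CLI:

```python
import subprocess

# Hypothetical HQL scripts, listed in dependency order: ingest, then transform,
# then aggregate. Each one must succeed before the next may start.
STEPS = ["ingest_feed.hql", "transform_feed.hql", "aggregate_daily.hql"]

def run_pipeline():
    for script in STEPS:
        # `hive -f <file>` executes the HQL file; a non-zero exit code
        # stops the chain so downstream steps never see partial data
        result = subprocess.run(["hive", "-f", script])
        if result.returncode != 0:
            raise RuntimeError(f"pipeline step failed: {script}")

if __name__ == "__main__":
    run_pipeline()
```

The same ordering guarantee is what the Crontab and Oozie schedules enforce at the workflow level.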
Environment: Hadoop, HDFS, Hive, UNIX, Shell Scripting, Oozie, Sqoop, Core Java, Web Services, Elasticsearch, SAS, Oracle, ServiceNow, Rally
Technology Lead
Confidential, Phoenix, AZ
Responsibilities:
- Created POA roadmaps for architecture solutions to reduce the application's mainframe footprint and move to open-source solutions.
- Worked as a key contributor to the application solution architecture, creating design solutions expected to yield application infrastructure cost savings of about $1 million per year.
- Created solution diagrams and documentation.
- Created data import queries for Solr import from mainframe DB2.
- Designed and tuned the document-update process for the Solr index (see the sketch after this list).
- Designed Web service solutions for external applications.
- Leading and coordinating with offshore development team for development and unit testing.
- Performed code reviews and coordinated with the QA team for system and user testing.
- Updated Rally for project status and documentation, and attended daily scrum ceremonies.
- Planned production releases and implementation, post-implementation validation, and incident support using ServiceNow.
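A minimal Python sketch of the document-update flow into Solr mentioned above, posting a batch of rows (already read from DB2) into Solr's standard JSON update handler. The core name and the per-batch commit strategy are illustrative assumptions.

```python
import requests

# Core name "accounts" is illustrative; the /update handler path is standard Solr
SOLR_UPDATE = "http://localhost:8983/solr/accounts/update"

def index_rows(rows):
    """Send one batch of documents (a list of dicts) to the Solr index."""
    resp = requests.post(
        SOLR_UPDATE,
        json=rows,                  # Solr accepts a JSON array of documents
        params={"commit": "true"},  # hard commit per batch; tune for throughput
    )
    resp.raise_for_status()
```

Batching rows and deferring commits (or using soft commits) is the usual lever when tuning update throughput into the index.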
Environment: Solr, Mainframe, COBOL, JCL, DB2, Java RESTful Web Services, JavaScript, ServiceNow, Change Management
Technology Analyst
Confidential, Phoenix, AZ
Responsibilities:
- Project estimation and business requirement documentation.
- Analyzed and designed high-performance solutions along with technical architects, and created the required design documentation.
- Provided innovative solutions to improve query performance by implementing search engine solutions on Solr, resulting in mainframe CPU cost savings of about $200K per year (see the sketch after this list).
- Established balancing and control process between applications and devised data flow strategies.
- Enhanced the application to simplify the architecture and reduce the application processing time.
- Designed DB2 tables to support the application data storage and processing.
- Worked on the REST web service specification for interaction between the web UI and DB2, and on the implementation of an SOA.
- Worked with the offshore team to develop and test nightly batch processing modules using JCL and COBOL.
- Handled project change management, implementation, and delivery-timeline tracking; investigated root causes and provided application support.
- Performed data analytics to support business decisions and future enhancements.
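A minimal Python sketch of the query-offload idea behind the Solr solution above: lookups are served from the search index instead of CPU-metered mainframe DB2 queries. The core name and field are illustrative assumptions.

```python
import requests

SOLR_SELECT = "http://localhost:8983/solr/accounts/select"  # core name illustrative

def search(term, rows=10):
    """Serve a lookup from Solr instead of issuing a DB2 query."""
    resp = requests.get(
        SOLR_SELECT,
        params={"q": f"name:{term}", "rows": rows, "wt": "json"},
    )
    resp.raise_for_status()
    return resp.json()["response"]["docs"]
```

Every lookup answered here is a DB2 query, and its mainframe CPU charge, avoided; that is where the cost savings come from.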
Environment: Mainframe, DB2, COBOL, JCL, CONTROL-M, Spring MVC, Web Services, HTML, CSS, JavaScript, ServiceNow
Technology Analyst
Confidential, Phoenix, AZ
Responsibilities:
- Played the role of onsite coordinator for the India based offshore team.
- Collected business requirements and translated them into use cases for design and development.
- Created design solutions for performance improvement of nightly batch jobs moving data from OLTP to OLAP tables, resulting in CPU cost savings of $500K per year.
- Tuned SQL to enhance query performance and adhere to a strict SLA of sub-second response time.
- Devised strategies to enhance merchant-name search performance: multiple DB2 strategy tables were created to store counts of merchants with different names, keeping name-search query response times sub-second (see the sketch after this list).
- Worked with the DataPower team to create web service specifications for the client website.
- Worked on improving the data quality of merchant names and addresses by developing processes and interfaces with data cleansing vendors like Acxiom.
- Created processes and services to get geo-location data from providers like Google and store it in mainframe DB2 tables.
- Enhanced the application to provide cleansed merchant data to marketing applications via SFTP file transfer.
- Provided data analytics support to understand the data and help the business make crucial data decisions.
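A minimal Python sketch of the strategy-table pattern described above, written against any DB-API cursor: a precomputed count table decides whether the expensive merchant scan runs at all. Table and column names are illustrative assumptions.

```python
def merchant_search(cur, name):
    """Route a name search through a precomputed DB2 strategy table."""
    # MERCH_NAME_COUNTS is a hypothetical strategy table mapping name -> match count
    cur.execute("SELECT MATCH_CNT FROM MERCH_NAME_COUNTS WHERE NAME = ?", (name,))
    row = cur.fetchone()
    if row is None or row[0] == 0:
        return []  # no matches recorded, so skip the expensive lookup entirely
    # A bounded fetch keeps even very common names within the sub-second SLA
    cur.execute(
        "SELECT MERCH_ID, NAME, CITY FROM MERCHANTS "
        "WHERE NAME = ? FETCH FIRST 100 ROWS ONLY",
        (name,),
    )
    return cur.fetchall()
```

The count table is cheap to maintain on write and turns the worst case (a very common name) into a bounded, predictable read.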
Environment: Mainframe DB2, SQL PL, COBOL, JCL, DataPower, SOAP APIs, SFTP, CONTROL-M
Developer and Senior Developer
Confidential
Responsibilities:
- Worked on the development of JCL batch and IMS-MQ transaction processing systems.
- Worked on the design, coding, and implementation of applications, along with the required documentation.
- Created test case documents and captured unit testing results.
- Worked with onsite coordinator to understand design requirements to accomplish business goals.
- Used tools like ChangeMan and IBM Data Studio extensively for build and testing.
- Implemented cost-saving suggestions and performance improvement ideas.
- Planned and created batch job schedules through JOBTRAC.
- Led a team of junior developers as a mentor and guide.
- Considered the project SME, a strong problem solver, and an efficient team member.
Environment: Mainframe, COBOL, DB2, IMS - MQ, JCL, JOBTRAC