- 9+ years of comprehensive IT experience in Banking and financial services industry.
- 2+ years of experience in Big Data analysis using Apache Hadoop framework.
- Working knowledge of writing MapReduce programs in Java and data-analytics jobs in Hive and Pig.
- Experience in configuring multi-node Hadoop clusters with HDFS.
- Performed data analysis and scoping using Hive, Pig, MapReduce, Flume, Sqoop, and Oozie.
- Strong knowledge of mainframe programming languages such as COBOL, REXX, and Easytrieve.
- Led various phases of SDLC from requirement gathering to post implementation support, involving extensive interaction with client's businesses across multiple countries.
- Experienced in handling application and production support and incident tracking.
- Involved in storage management and capacity planning.
- Responsible for Performance tuning & monitoring.
- Strong understanding of test life cycle, test management document and test plan creation.
- Exposure to manual and automation testing, including test-scenario identification and test-script writing.
- Good team player with the ability to work independently in a time-sensitive environment.
- Ability to adapt to new technologies and successfully oversee a project from conception through development.
- Excellent working and leadership skills in an onsite-offshore delivery model.
BIG DATA ECOSYSTEM: Hadoop, MapReduce, HDFS, Hive, Pig, HBase, ZooKeeper, Sqoop, Oozie, Flume, basics of Spark and Spark SQL.
LANGUAGES: SQL, COBOL, Hive, Pig, Java (basic), Python, JCL, IBM REXX, Easytrieve, Embedded C
DATABASES: DB2, VSAM, MySQL, Oracle
OPERATING SYSTEMS: Windows 2003/XP/7, Linux, IBM z/OS
TESTING TOOLS: HP Quality Center
METHODOLOGIES: V/Waterfall, Agile
OTHERS: TWS, B92, File-Aid, MINITAB, TSO, ENDEVOR, Abend-AID, File Manager, ISPF, CICS (user interface), CA-7, eCHAMPS, Changeman, NDM, Microsoft Office (Word, Excel, Visio, PowerPoint), Eclipse, Tableau, Pentaho
Confidential, Tampa, FL
- Worked as a Programmer Analyst on Hadoop MapReduce and Hive, covering business requirement gathering, analysis, scoping, design, development, and test-case creation and execution.
- Loaded and transformed large sets of structured and semi-structured data coming from different sources.
- Involved in loading data from the Linux file system into HDFS.
- Supported MapReduce programs running on the cluster.
- Provided cluster coordination services through ZooKeeper.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Performed extensive data validation using Hive and Python.
- Identified the insights and generated charts & reports using Tableau.
- Extensively used Sqoop to import and export data between MySQL and HDFS/Hive for data analysis.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
- Conducted knowledge transfer sessions for team members on Hadoop MapReduce programming and configuration files.
- Worked hands-on with the ETL process; reported all defects to the respective teams using HP Service Manager.
- Worked closely with the business to sort out requirements and participated in data review and validation meetings.
- Performed data integrity and data validity testing by following set guidelines.
- Aggregated and analyzed key project and program metrics on a regular basis and produced summary reports as required, including the program dashboard.
- Held daily/weekly status calls with the offshore team to track status and issues.
Environment: Hadoop, HDFS, MapReduce, Hive, Python, Pig, Oracle, MySQL, Tableau, HBase, Sqoop, Spark, Spark SQL, Linux, Eclipse
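The Hive workflow described in the bullets above (defining tables over data landed in HDFS and querying them via MapReduce) can be sketched roughly as follows; this is a minimal illustration, and all table, column, and path names are hypothetical:

```sql
-- Hypothetical external Hive table over delimited files in HDFS
CREATE EXTERNAL TABLE IF NOT EXISTS txn_staging (
  txn_id     STRING,
  account_id STRING,
  amount     DECIMAL(12,2),
  txn_date   STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/staging/transactions';

-- A query of this shape compiles internally into MapReduce jobs
SELECT account_id, SUM(amount) AS total_amount
FROM   txn_staging
GROUP  BY account_id;
```

The external-table approach keeps the raw HDFS files in place (dropping the table does not delete the data), which suits pipelines where Sqoop or Flume land files independently of Hive.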
- Developed code based on the approved technical specifications.
- Developed and modified SQL code to make new enhancements or resolve problems, as per customer requirements.
- Developed stored procedures, functions, and packages for validating and migrating data from staging to target.
- Created views on multiple tables.
- Created indexes on selected table columns to increase application performance.
- Migrated data from flat files to tables using SQL*Loader.
- Performed data migration using import/export utilities.
- Responsible for SQL tuning and optimization using explain plan, as well as writing test plans and unit tests for application modules.
- Strong client interaction and presentation skills.
- Analyzed table indexes for performance tuning.
Environment: SQL, Oracle, Windows XP
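The indexing and explain-plan tuning mentioned above can be sketched with a minimal Oracle example; the table, column, and bind-variable names are hypothetical:

```sql
-- Hypothetical index to support lookups by customer
CREATE INDEX idx_orders_cust ON orders (customer_id);

-- Capture the optimizer's plan for the query under tuning
EXPLAIN PLAN FOR
SELECT order_id, order_total
FROM   orders
WHERE  customer_id = :cust_id;

-- Display the captured plan to confirm an index range scan is used
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

Comparing the plan before and after index creation (full table scan versus index range scan) is the usual way to verify that the index actually changed the access path.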
Senior Software Engineer
Responsibilities:
- Developed COBOL programs and updated CICS programs that identified payments that could not be posted online and posted them in batch mode and generated necessary reports.
- Prepared functional and technical specification documents and conducted unit testing.
- Ran data extracts from VSAM or flat files using utilities such as File-Aid and DFSORT for clients and business partners.
- Coordinated with third parties like PLM, eCHAMPS, SWIFT etc., to define project responsibilities.
- Reviewed the test plan with testing team managers and prepared the SIT and UAT test environments.
- Wrote ad-hoc queries in DB2 for business partners.
- Updated Quality center with issue logs.
- Involved in PSD production implementation and provided post-production support.
- Designed overall logic by reviewing functional specs and prepared technical specification documents for my module and obtained business signoff.
- Coded new COBOL and CICS programs as a part of implementing the new logic.
- Made code changes, prepared new jobs from scratch, and conducted unit testing.
- Set up the SIT test environment, performed regression testing, and helped testing teams with test plan preparation.
- Coordinated project meetings involving business, test and development teams to monitor progress.
Environment: COBOL, JCL, REXX, VSAM, CICS, Easytrieve, NDM, Changeman, Expeditor, HP QC.
- As the project lead oversaw the successful completion of all the project activities
- Coordinated with business teams to finalize functional and technical specification documents.
- Designed and implemented the COBOL programs to block future credit authorizations coming from demised cards.
- Designed and implemented one-time programs to purge existing accounts and related data.
Environment: COBOL, JCL, VSAM, NDM, CICS, B92, Easytrieve, CA-7, File-Aid, Changeman, HP QC.
- Planned and conducted a three-week training for 5 Chinese associates.
- Acted as a liaison between teams in India and China to complete the project
Environment: COBOL, JCL, VSAM, CICS, Easytrieve, CA-7, TSO/ISPF, DFSORT, File-Aid, Changeman, HP QC.
- Involved in gap analysis, estimation, planning, execution, and implementation of technical solutions for the Confidential's specific business requirements.
- Responsible for production application issue resolution and disaster management testing & reporting
- Applied, tested and supported any compliance related changes.
- Led resource planning for work shifts, production and test batches, and application issue handling.
Environment: COBOL, JCL, VSAM, CICS, DB2, Easytrieve, File-Aid, Changeman, HP Quality Center.