Data Analyst Resume
NY
SUMMARY:
- Around 18 years of experience in Analysis, Design, Development, Automation, Testing, Data Movement, Production Support, Job Scheduling, Application Performance Monitoring, Implementation and Support of IT applications.
- Spent the last 7 years working in data movement and 3 years in Big Data development and data movement for data analytics.
- Self-motivated and highly target-oriented individual in application design and development.
- Worked on projects to stream data from RDBMS to HDFS/Hive through a Flume/Kafka (Flafka) design.
- Experience in one-time bulk data movement from RDBMS to HDFS through Mainframe JCL and Sqoop.
- Created design solutions for data movement, working with Cloudera to configure Kafka Channel and Flume configuration properties and interceptors.
- Created data lineage diagrams and metadata documents.
- Worked on data compaction and data compression processes.
- Designed and reviewed Scala programs that read from a Kafka Channel and stream data into Hive through Spark Streaming.
- Worked on a data reconciliation program between MQ, HDFS, and Hive.
- Experience on IBM Mainframes using COBOL, PL/1, JCL, CICS, VSAM, DB2, IMS-DB/DC, REXX, CLIST, and IDMS.
- Strong Domain experience in Insurance, Banking, Bank Card, Finance, and Telecom applications.
- Experience in managing multiple z/OS LPARs and Virtual Machines.
- Prepared Architectural Diagrams, Process Flow Diagrams, High Level Design Documents and Technical Specification documents for various projects
- Conducted and participated in status meetings to collect data from business users.
- Experience in application Performance tuning
- Validated component Spec documents created by IT developers
- Experienced in Onsite-Offshore model projects execution
- Well experienced in Job scheduling and Monitoring
- Provided technical consulting support on project and system issues.
- Strong knowledge in creating and maintaining test environments
- Implemented various projects in production, prepared and reviewed several implementation scripts and backup plans, and performed production support.
- Experienced in requirement gathering from the user.
- Strong knowledge of testing methodology and QA processes.
- Involved in unit testing, system testing, white-/black-box testing, integration testing, functional testing, GUI/user screen testing, and stress testing.
- Experienced in leading a medium-sized team.
- Created a new test region for production parallel runs, including production data transfer, data masking, and scheduling.
- Solved many Bridge Cases/STRs raised by users.
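The one-time bulk data movement from RDBMS to HDFS through Sqoop noted above typically reduces to a single import job. A minimal sketch of such an invocation; the connection string, credentials, table, and target path are illustrative placeholders, not details from the original projects:

```sh
# Hypothetical one-time bulk import of an RDBMS table into HDFS via Sqoop.
# Host, database, table, and paths are illustrative only.
sqoop import \
  --connect jdbc:db2://db2host:50000/SAMPLEDB \
  --username loaduser --password-file /user/loaduser/.pw \
  --table POLICY_MASTER \
  --target-dir /data/bulk/policy_master \
  --as-avrodatafile \
  --compress --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --num-mappers 8
```

The Avro output format and Snappy codec shown here line up with the formats and compression work listed in the skills section; the mapper count controls the parallelism of the bulk extract.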
TECHNICAL SKILLS:
Operating Systems: UNIX, MVS, MVS/ESA, OS/390, z/OS, DOS, Windows NT/2000/XP/Vista
Mainframe Technology: TSO/ISPF, COBOL, PL/1, JCL, CICS, VSAM, DB2, IMS-DB/DC, CLIST, REXX, CSP, IDMS, ASSEMBLER, Easytrieve, NDM
Hadoop Skills: HDFS, Kafka, Flume, Hive, Impala, Cloudera Manager, Hue, Scala, Spark, Oozie, Snappy, Avro, Parquet, and Sqoop
Configuration Tools: ENDEVOR, CHANGEMAN
Schedulers: Oozie, Crontab, OPC, CA-7, and Control-M
Archival Tools: SAR
Testing Tools: XPEDITOR, INTERTEST
Database Tools: Hive, Impala, Apptune, QMF, STROBE, BMC, Platinum Tools, Mainview, Message Advisor, BMC Log Analyzer, IBM Admin Tool
Other Tools: IntelliJ, FILE-AID, COMPAREX, SUPER CE, INFOMAN, SYNC-SORT
Data Transfer Tools: MQ, Admin7, NDM, FTP, XMIT, Informatica
Other Skills: Oracle 8i, SQL, VB, ASP, SQL 6.5, Java, HTML
Productivity Packages: HP Service Manager, Lotus Notes, MS Office and Outlook
PROFESSIONAL EXPERIENCE:
Confidential
Data Analyst
State Farm Insurance, USA
- Worked on Spark data streaming to stream data from RDBMS to HDFS/Hive through a Flume/Kafka (Flafka) design.
- Performed one-time bulk data movement from RDBMS to HDFS through Mainframe JCL and Sqoop.
- Created design solutions for data movement, working with Cloudera to configure Kafka Channel and Flume configuration properties and interceptors.
- Developed and reviewed Scala programs that read from a Kafka Channel and stream data into Hive through Spark Streaming.
- Worked on a data reconciliation program between MQ, HDFS, and Hive.
- Worked on security, firewall setup, and performance testing in Hadoop.
- Created and maintained 600 VMIPLEX (VM), 200 Agency CICS, and 25 SIMPROD environments.
- Ran production unloads from DB2 and IMS databases and loaded them into all the environments.
- Applied DB2 and IMS production changes to all the test environments.
- Prepared Architectural Diagrams, Process Flow Diagrams, High Level Design Documents and Technical Specification documents
- Conducted and participated in status meetings to collect data from business users.
- Validated technical design documents created by IT developers.
- Provided technical consultation and support in the development of a complex mainframe application.
- Assisted customers with mainframe access problems.
- Executed Disaster Recovery tests for Mainframe DB2.
- Scheduled, monitored, and troubleshot batch jobs.
- Provided appropriate access to DB2 tables and datasets for the project.
- Worked with service providers such as IBM to issue fixes to the Mainframe.
- Provided technical consulting support on project and system issues.
- Monitored and fixed DB2 and IMS issues in the environments.
- Updated the environments with project staging libraries.
- Created and ran project-specific flows in the environments.
- Created project databases, tables, views, triggers, and stored procedures and loaded them with production/test data.
- Ran REORGs, RUNSTATS, BINDs, GRANTs, and unloads on databases and tables.
- Extensively used APPTUNE, Message Advisor, BMC products, BMC Log Analyzer, STROBE, and in-house tools.
- Updated Mainframe CICS with programs, screens, map sets, rates, etc.
- Updated environments with simulated policyholder data provided by the project.
- Provided incident resolution for z/OS Mainframe environment issues.
- Applied weekly production updates to all CICS partitions.
- Assisted with Agency testing assets in the Testing Lab as needed.
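The Flafka design described above is normally wired together through a Flume agent configuration in which Kafka serves as the durable channel. A minimal sketch; the agent, broker, topic, and path names are illustrative, and a spooling-directory source stands in for the project-specific RDBMS-facing source:

```properties
# Hypothetical Flume agent "a1": events buffer in a Kafka channel
# and drain to time-partitioned HDFS directories.
a1.sources = src1
a1.channels = kc1
a1.sinks = hdfs1

# Stand-in source (the real RDBMS-facing source is project-specific)
a1.sources.src1.type = spooldir
a1.sources.src1.spoolDir = /var/flume/spool
a1.sources.src1.channels = kc1

# Example interceptor attaching a timestamp header to each event
a1.sources.src1.interceptors = ts
a1.sources.src1.interceptors.ts.type = timestamp

# Kafka channel: Kafka acts as the durable buffer between source and sink
a1.channels.kc1.type = org.apache.flume.channel.kafka.KafkaChannel
a1.channels.kc1.kafka.bootstrap.servers = broker1:9092,broker2:9092
a1.channels.kc1.kafka.topic = flume-channel-topic

# HDFS sink writing date-partitioned directories
a1.sinks.hdfs1.type = hdfs
a1.sinks.hdfs1.channel = kc1
a1.sinks.hdfs1.hdfs.path = /data/landing/%Y-%m-%d
a1.sinks.hdfs1.hdfs.fileType = DataStream
```

The timestamp interceptor supplies the header that the HDFS sink's date escapes (`%Y-%m-%d`) resolve against; a Spark Streaming consumer can read the same Kafka channel topic in parallel.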
ENVIRONMENT: UNIX, HDFS, Kafka, Flume, Hive, Impala, Cloudera Manager, Hue, Scala, Spark, Oozie, Crontab, Sqoop, MQ, ADMIN7, z/OS, TSO/ISPF, COBOL, JCL, DB2, IMS-DB/DC, CICS, REXX, VSAM, SAR, FILE-AID, CA-7, BMC, IBM Admin Tool, XPEDITOR, APPTUNE, STROBE, MAINVIEW, TIVOLI, ENDEVOR, CONTROL-M, NDM
Systematics Maintenance and Support
Confidential, NY
- Interacted with clients, business, and users on new business requirements.
- Involved in development activities in Fidelity applications.
- Tracked defects during SIT/UAT; responsible for closing defects and obtaining user and business sign-off for production implementation.
- Prepared functional specifications and test plans and reviewed test results provided by the team.
- Prepared production implementation scripts, backup plans, and COB updates.
- Provided onsite/offshore support for batch and testing of SI applications.
- Provided knowledge transfer (KT) to the offshore team and trained them to support the daily batch.
- Responsible for the daily batch cycle, which has more than 1,000 jobs in OPC.
- Reviewed the problem log sent by the offshore team and provided permanent solutions for abends.
- Responsible for providing test environments for business users to perform testing.
ENVIRONMENT: MVS, COBOL, JCL, VSAM, INTERTEST, SAR, FILE-AID, CHANGEMAN, INFOMAN, NDM, OPC on S/390
Data Analyst
Confidential, OH, USA
- Interacted with clients on new business requirements.
- Created design documents and was involved in development, testing, implementation, and production support.
- Involved in creating the System Document for each application.
- Interacted with application managers to collect information pertaining to their respective systems.
- Created processing flow diagrams and architectural diagrams for various applications.
- Facilitated the onsite/offshore model by transferring knowledge to the offshore team and resolving queries raised by them.
- Conducted peer reviews of documents and programs created by the team.
- Involved in analysis of programs and JCLs.
- Ensured quality deliverables on time and was involved in QA audits.
ENVIRONMENT: MVS, COBOL, JCL, VSAM, DB2 on S/390
Data Analyst
Confidential, NY
- Developed new COBOL/CICS Programs and CICS Maps from CSP Code.
- Converted CSP DB2 queries to COBOL/CICS/DB2 equivalent queries.
- Developed automated scripts to convert CSP to COBOL.
- Involved in creating a flow chart for each individual program.
- Clean compiled and unit tested the migrated code.
- Provided support for System Integration Testing (SIT) and User Acceptance Testing (UAT).
- Responsible for maintaining Quality Standards related to the project deliverables.
- Prepared and Reviewed Test cases and involved in QA audit.
ENVIRONMENT: MVS, CSP, COBOL, CICS, DB2, INTERTEST, CHANGEMAN on S/390
Data Analyst
Confidential
- Involved in the analysis and design of requirements.
- Coded new programs and JCLs.
- Solved bridge cases.
- Rationalized the programs.
- Prepared system test plans and logged test results.
- Audited the team project.
- Actively involved in maintaining Quality Standards related to the project.
ENVIRONMENT: MVS, Easytrieve, COBOL, IDMS, JCL, CA-7 scheduler, XMODS on S/390
Data Analyst
Confidential
- Involved in Coding and Testing.
- Involved in Impact analysis, review, conversion and Testing.
- Responsible for solving work packets at offshore.
- Involved in preparing unit test plans and defect logs.
- Actively involved in maintaining Quality Standards related to the project.
ENVIRONMENT: MVS, COBOL, CICS, DB2, JCL on S/390
Data Analyst
Confidential, USA
- Involved in Coding and Testing.
- Involved in the Analysis and Design of the Batch Module.
- Worked on batch programs to convert VAX COBOL to Mainframe COBOL.
- Created program documentation.
- Involved in preparing Internal Technical Review (ITR) documents.
- Involved in the analysis, design, and conversion of VAX COBOL to Mainframe COBOL.
ENVIRONMENT: Mainframe - MVS, TSO/ISPF, COBOL, CICS, DB2, and JCL.
Data Analyst
Confidential
- Involved in Coding, Unit Testing and Implementation
- Created Program Documentation.
- Prepared and Reviewed Test cases.
ENVIRONMENT: ASP, HTML, JavaScript, IIS 5.0, Oracle 8.0