ETL/Big Data Developer Consultant Resume
MA
SUMMARY:
- Around 18 years of experience in the IT industry, including around 2.5 years in Big Data analytics
- Worked with tools in the Hadoop ecosystem, including Hadoop/Spark, Sqoop, Python, Hive, and HDFS
- Migrated legacy DataStage ETL projects to Hadoop
- Able to move data in and out of Hadoop from various RDBMS using Sqoop, and via other traditional data movement technologies using APIs
- Hands-on experience writing Python scripts in the Spark framework
- Used a UNIX edge node as a gateway between the Hadoop cluster and external applications
- Hands-on experience working with ecosystem tools such as Hive and Sqoop
- Played multiple roles: Application Architect, Advisory Systems Analyst, Project Lead, SME, BSA, Developer, and SPOC for BCM and Data Security and Privacy
- Worked extensively on IBM InfoSphere DataStage 8.7 and 11.5 using Components like DataStage Designer, DataStage Director, DataStage Administrator & Information Server, Unix
- Extensive experience in Unix, COBOL, CICS, JCL, DB2, Natural, Adabas, Stored Procedures, VSAM, IBM Mainframe systems, Oracle, Netezza, SQL Server, XML files, COBOL/Mainframe files, and flat files
- Gathering requirements from the business and preparing Design Documents
- Create/update Architectural designs and ER diagrams
- Coding and Testing of batch/online programs, UNIX scripts
- Preparation of Job flow diagrams and Entity relationship diagrams using Visio
- Hands-on experience with debugging tools such as MFE, Xpediter, and InterTest
- Hands-on experience with development tools such as RDz, MFE, Endevor, File-Aid, SPUFI, and SDF
- Developed and executed Unit Test Plans and Unit Test Cases based on the business requirements
- Performed Functional testing, Integration testing and Regression testing based on project requirement
- Well versed with writing SQL queries and DB2 Stored Procedures
- Worked on projects with domains including Finance, Retail and Distribution, Automotive, Banking, Credit cards and Telecom
- Review of Deliverables and Defect logging and Conducting Defect Prevention meetings
- Conducted Defect prevention meetings, prepared Pareto Chart and Fish bone diagram for defect analysis
- Prepared the project's Quality Metrics and participated in PMR internal and external audits
- Executed a Six Sigma Project on Code Review Defects
- Certified as a Quality Ambassador by IBM
- Trained on CSQA and Project Management Program
- Trained in Agile practices; facilitated scrums and played the Scrum Master role
- SPOC for the Data Security and Privacy requirements of the project
- Part of the appraisal process: added projects to the portal, updated the PMO, approved timesheets, etc.
- Maintaining Business Continuity Management (BCM) artifacts for the project
- As the BCM SPOC, coordinated the execution of BCP testing and shared reports with the client
- Participated in recruitment drives conducted by the employer and projects
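The Sqoop-based data movement described above can be sketched as a small helper that composes a basic `sqoop import` command line; this is a minimal illustration, and the JDBC URL, table name, and target directory are all hypothetical placeholders, not values from any actual project.

```python
def build_sqoop_import(jdbc_url, table, target_dir, num_mappers=4):
    """Compose the CLI arguments for a basic Sqoop table import.

    The connection details here are illustrative only; a real import
    would also typically pass credentials and Hive options.
    """
    return [
        "sqoop", "import",
        "--connect", jdbc_url,       # JDBC URL of the source RDBMS
        "--table", table,            # source table to pull
        "--target-dir", target_dir,  # HDFS landing directory
        "--num-mappers", str(num_mappers),  # parallelism of the import
    ]

# Hypothetical example invocation
cmd = build_sqoop_import("jdbc:oracle:thin:@dbhost:1521/ORCL",
                         "ACCOUNTS", "/data/raw/accounts")
print(" ".join(cmd))
```

In practice such a command would be launched from the UNIX edge node, which is what makes the edge node the gateway between the Hadoop cluster and the external RDBMS.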
TECHNICAL SKILLS:
Technologies: Hadoop, Spark, IBM InfoSphere DataStage (ETL), Java, and IBM Mainframes
Schedulers: Autosys, CA-7, Oozie
Database & File Systems: Oracle 10g/9i/8i/8.x, Netezza, SQL Server 2003, DB2, MySQL, ADABAS, VSAM, Teradata, Flat files
Languages: Python, Scala, COBOL, CICS, JCL, Java, Natural, Unix shell
Tools: Yarn, Sqoop, JIRA, Aginity, ISPF, SPUFI, Endevor, File-Aid, Xpediter, CFI, NDM, BMC Unload, CA-7, SYNCSORT, EST, ICETOOL, SOA, XML, Web Services, MFE (Mainframe Express), RUMBA, STS (Spring Tool Suite), Jenkins Enterprise
Methodologies: Agile, Star Schema, Snowflake Schema
PROFESSIONAL EXPERIENCE:
Confidential, MA
ETL/Big Data Developer Consultant
Responsibilities:
- Created PySpark programs to perform analytics on various sources of data
- Loaded data from RDBMS sources into Hive/HDFS using Sqoop
- Used a UNIX edge node as a gateway between the Hadoop cluster and external applications
- Used the Git version control tool to store the Python code related to the loaders
- Created Spark SQL queries to meet all transformation requirements
- Created Python scripts using the pandas library for lower-volume datasets
- Participate in requirement discussions with business
- Create GWT (Given-When-Then) scenarios for the requirements
- Follow the ATDD (Acceptance Test Driven Development) methodology in the project
- Train new resources on the Media-In and Media-Out applications
- Trained in Agile practices and facilitates scrum
- Adhering to Shift-left principles for testing
- Analyzing the Mainframe (COBOL, JCL, DB2, Stored Procedures), ETL/DataStage, Java, and Unix scripts
- Provide design and development solution to requirements
- Preparing Data Mapping document and design specifications
- Preparing estimates for assigned tasks and tracking their status
- Develop Datastage jobs for Extraction, Transformation, Cleansing and creating files for downstream Legacy applications
- Profiling and analyzing data from different sources, addressing the data quality issues, Transforming and processing of data and loading the data into DB2 tables.
- Captured data from a variety of sources DB2, Flat Files, Mainframes and other formats.
- Extensively worked with Data Stage Shared Containers for Re-using the Business functionality.
- Support Integration and User Acceptance testing
- Develop best practices, design standards and apply them to the project
- Follow the Quality process / Key Controls mandated by client and deliver high quality deliverables
- Perform Elevation preparation tasks and take part in Elevation
- Create elevation packages and Change record for Deployment
- Perform Production job monitoring and bug fixing
- Support JAVA modernization project
- Answer MI/MO CRF-related queries from other teams such as JBETL, IESS, NQP, CRA, and Data Architects
Environment: Spark, Hadoop, IBM InfoSphere DataStage 8.7, IBM Mainframes, DB2, COBOL files, Sequential files, JCL, XML files, CICS, COBOL, VSAM, SPUFI, QMF, ISPF, Java, Insync, Control-M, PuTTY for Unix, STS (Spring Tool Suite), Jenkins Enterprise, Hive, UNIX edge node, Yarn
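The pandas scripts mentioned above for lower-volume datasets can be sketched roughly as follows; the file contents, column names, and aggregation are all illustrative assumptions, with an in-memory buffer standing in for a flat file landed on the edge node.

```python
import io

import pandas as pd

# Illustrative extract; in practice this would be a delimited flat file
# landed on the UNIX edge node rather than an in-memory buffer.
raw = io.StringIO(
    "account_id,region,balance\n"
    "A1,NE,100.50\n"
    "A2,NE,200.00\n"
    "A3,SW,50.25\n"
)

df = pd.read_csv(raw)

# Simple transformation: total balances per region, the kind of
# summarization that is practical in pandas for smaller datasets.
summary = df.groupby("region", as_index=False)["balance"].sum()
print(summary)
```

For larger volumes the same shape of logic would move into PySpark/Spark SQL, which is the split between the Spark and pandas bullets above.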
Confidential, NJ
ETL Consultant / Computer Systems Analyst
Responsibilities:
- Understood the technical specifications and developed DataStage jobs for the Extraction, Transformation, Cleansing, and Loading processes of the DW
- Profiling and analyzing data from different sources, addressing the data quality issues, Transforming and processing of data and loading the data into DB2 tables.
- Captured data from a variety of sources including Oracle, Netezza, DB2, Flat Files, Mainframes and other formats.
- Extensively worked with Data Stage Shared Containers for Re-using the Business functionality.
- Extensively used Reject Links, Job Parameters, and Stage Variables in developing jobs
- Drafting technical documents like Overview documents, migration and deployment documents for every code release.
- Automated processing using batch logic, scheduling jobs on a daily, weekly, or yearly basis as required using Autosys
- Created database tables and used PL/SQL procedures for validation, balancing, and audit reports
- Work with Mainframe upstream Jobs using JCL, to make changes and refactor them as needed
- Involved in various reviews and meetings including Internal and external code review, weekly status calls, issue resolution meetings and code acceptance meetings.
- Assisted SIT testing team, UAT team and Production team during code release with code walk through and presentations and Defect identification, reporting and tracking.
Environment: InfoSphere DataStage 8.7, Mainframes, CICS, TSQ, TDQ, COBOL, DB2, VSAM, JCL, SPUFI, QMF, ISPF, INTERTEST, XPEDITOR, CHANGEMAN, Oracle 11g, Netezza, Fixed-width files, COBOL files, Sequential files, XML files, Aginity
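The balancing-report checks described above (comparing source against target after a load) can be sketched in SQL; here Python's built-in sqlite3 stands in for the DB2/Oracle procedures, and the table and column names are invented for the example.

```python
import sqlite3

# In-memory sqlite3 stands in for DB2/Oracle; tables are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE source_txn (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE target_txn (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO source_txn VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO target_txn VALUES (?, ?)",
                [(1, 10.0), (2, 20.0)])  # one row missing: load is short

# Balancing check: compare row counts and totals between source and target.
cur.execute("""
    SELECT
        (SELECT COUNT(*)    FROM source_txn) AS src_rows,
        (SELECT COUNT(*)    FROM target_txn) AS tgt_rows,
        (SELECT SUM(amount) FROM source_txn) AS src_total,
        (SELECT SUM(amount) FROM target_txn) AS tgt_total
""")
src_rows, tgt_rows, src_total, tgt_total = cur.fetchone()
balanced = (src_rows == tgt_rows) and (src_total == tgt_total)
print(src_rows, tgt_rows, src_total, tgt_total, balanced)
```

A validation or audit report is the same idea with more columns compared; a stored procedure would typically write the discrepancy rows to a report table rather than print them.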
Confidential, Wayne, NJ
Advisory Systems Analyst
Responsibilities:
- Responsible for managing the scope, planning, tracking, and change-control aspects of the project
- Analysis of Business requirements and Specifications
- Preparation of Estimate for the requirements received from client
- Creating Datastage Jobs and UNIX scripts
- Coding online and batch programs using CICS, COBOL, DB2 and Stored Procedures
- Preparation of test cases, Unit testing and regression testing
- Perform Unit testing, System and Integration testing
- Support users in User Acceptance testing
- Review of deliverables and logging the review comments
- Tracking in process and post release defects
- Production Job monitoring and Bug fixing
- Perform Root cause analysis for the post release defects
- Responsible for effective communication between the project team and the customer
- Translate customer requirements into formal requirements and design documents
- Establish Quality Procedure for the team, monitor and audit to ensure team meets quality goals
- To perform the role of a team lead: manage work allocation, mentor, ensure coordination among the team members, and gain the confidence of the onsite SMEs/leads/business
- To participate actively in team/customer meetings and ensure co-ordination between onsite/offshore team
- To make sure the commitments made to the client, quality of deliverables are met
- Mentor the team in technical/business areas and help them resolve issues related to it
- Review work status and assist the team in all phases of the software engineering cycle as and when required
- Responsible to adhere to the Quality procedures related to project and organization
- Responsible for sending Project’s Quality Metrics to Quality management team
- Facing Internal & External Audit for the project
- SPOC for the Data Security and Privacy requirements of the project
- Maintaining Business Continuity Management artifacts and data for the project
- As the BCM SPOC, coordinated BCP testing
- Interviewed new resources for the account and onboarded new recruits into Mainframe technologies, tools, and utilities
Environment: CICS, TSQ, TDQ, COBOL, DB2, VSAM, JCL, InfoSphere DataStage 8.7, UNIX, CA-7, SPUFI, QMF, ISPF, INTERTEST, XPEDITOR, CHANGEMAN
Confidential, Bentonville, AR
Systems Analyst / Project Lead
Responsibilities:
- Work with Business Analysts in translating business requirements into Detailed Design Documents
- Lead analysis sessions, gather requirements and write specification and functional design documents for enhancements and customization
- Coding of new Programs, Maps, JCLs, Stored Procedures
- Preparation of test cases and test reports
- Coordinate and communicate tasks with developers
- Ensure that development is performed as per requirements
- Work with QA to create test scripts and scenarios for enhancements and customizations to the core product
- Involved in Development of new modules and Implementation
- Communicate activities/progress to project managers, business development, business analysts and clients
- Develop implementation and test plans, build software acceptance criteria, coordinate and work with clients to oversee the acceptance and dissemination process
- Performing Impact Analysis
- Review of deliverables and logging the review comments
- Defect Logging and conducting defect prevention meetings
- Studying Business Requirement and Preparation of Estimate
- Analysis of the specifications provided by the clients
- Task allocation and project tracking
- Testing the developed modules
- Preparation of Weekly status report and updating onsite
- Involved in peer review
Environment: CICS, MQ Series, COBOL, DB2, VSAM, JCL, Natural, Adabas
Confidential, FL
Senior Software Engineer
Responsibilities:
- Analyzing the business requirements and estimating the time and effort based on the complexity analysis
- Preparation of Detailed Design Specifications Document
- Prepared the Unit Test Plan as per the functionalities available in the legacy screens
- Prepared the list of all fields in each legacy screen and changed them according to the format in the Visual Basic platform
- Designed the migrated modules with a three-tier architecture: Tier I held the front end with GUI-based screens, Tier II the business logic and functionalities, and Tier III the file I/O operations and other data processing functions
- Tested functionalities as per the cases given in the test plan
- Addressed the Quality Assurance team's comments from their retesting of the same modules
- Reviewed the team's deliverables and logged defects in the defect log
- Preparation of Weekly status report and updating onsite
- Coordinating the Team and ensuring the Completion of Task assigned
Environment: CICS, DVS, TSQ, TDQ, BMS, DB2, VSAM, JCL, SPUFI, QMF, FILEAID, N2O
Confidential, CO
Programmer Analyst
Responsibilities:
- Performed impact analysis for new changes requested by the customer and prepared the list of impacted modules
- Prepared Low-Level Designs based on the High-Level Design Specifications sent by the onsite coordinator
- Prepared Unit Test Plans as per the requirements given in the Business Requirement Specifications and Low-Level Design Specifications
- Coded new programs and changed existing programs as per the coding standards given by the customer
- Performed unit testing for the various test cases in the test plan, repeating tests several times to ensure the accuracy of output
- Involved in code reviews and functional reviews; logged defects in the defect log
- Prepared release notes for all delivered elements
Environment: COBOL, CICS, DB2, DATACOM, VSAM, JCL, SPUFI, INTERTEST, FILEAID, DFSORT, PLATINUM, ENDEVOR
Confidential
Software Engineer
Responsibilities:
- Studying Business Requirement and Preparation of Estimate
- Tracking tasks received from the client
- Task allocation to the team
- Analysis of the specifications provided by the clients
- Design and Development
- Task allocation and project tracking
- Coding new programs/ Modifying the existing modules
- Testing the developed modules
- Preparation of Weekly status report and updating onsite
Environment: COBOL, CICS, DB2, DATACOM, VSAM, JCL, SPUFI, INTERTEST, FILEAID, DFSORT, PLATINUM, ENDEVOR