
Programmer Analyst Resume


Plano, TX

SUMMARY

  • 8+ years of experience as a Business System Analyst / Data Analyst / Scrum Master across multiple domains, including Banking and Aviation, with a strong understanding of business requirement gathering, business process flow, API documentation and development, and business & data analysis.
  • Possess strong communication, analytical, and problem-solving skills with superior leadership, team collaboration, training, and management skills to build consensus across relevant parties and drive business success.
  • Certified Scrum Master with over 4 years of experience working with self-organizing, cross-functional teams. Expertise in SAFe 4.0 and Agile Scrum environments. In-depth knowledge of the Software Development Life Cycle (SDLC), including Waterfall and Spiral, and experience with Kanban, Six Sigma, and Waterfall-Scrum hybrid methodologies.
  • Excellent business writing skills, covering Business Requirements Documents (BRD), Use Case Specifications, Functional Specifications, System Requirements Specifications (SRS), Data Requirements, and Workflows.
  • Strong understanding of the Scrum framework, including Scrum ceremonies, artifacts, and various prioritization and estimation techniques such as Kano, MoSCoW, Planning Poker, relative mass valuation, and T-shirt sizing.
  • Assisted Project Managers in preparing Project Plans and Schedules, establishing Milestones and the Work Breakdown Structure (WBS), and maintaining a RACI chart to manage overall project risks and mitigation plans.
  • Conducted Impact Analysis, Cost Benefit Analysis, GAP analysis, Root Cause Analysis, Feasibility Study, As-Is and To-Be business analysis, Risk Analysis, SWOT analysis and Force Field Analysis.
  • Excellent knowledge in dataflow analysis of the system using use case diagrams, dataflow diagrams, state diagrams, activity diagrams, flowchart diagrams, sequence diagrams, and collaboration diagrams.
  • Worked extensively with teams to meet business requirements and analyze workflows, and collaborated with project teams and architects to develop business process models.
  • Maintained a structured approach in organizing requirements to ensure that critical business rules and requirements are met with the help of Requirement Traceability Matrix (RTM).
  • Strong experience in Data Profiling, Data Migration, Data Conversion, Data Governance, Data Integration, Data Modeling, Data Warehousing, Metadata Management Services, Master Data Management and Configuration Management.
  • Experience with relational databases such as Oracle 11g/12c, PostgreSQL, SQL Server, and MS Access, as well as DynamoDB, and very efficient in tuning SQL queries using several tools and utilities. Utilized joins, NVL statements, unions, views, functions, procedures, cursors, and triggers in creating complex SQL and PL/SQL queries (see the illustrative query at the end of this summary).
  • Extensive experience in Data Warehouse and ETL process using Pentaho Kettle and Informatica Power Center and used it to extract the data from source systems, transform it according to business rules and load it into the DWH.
  • Expert in Data Conversion, Normalization, Denormalization, Multi-Dimensional Modeling and involved in creation of Fact Tables, Dimension Tables, Star Schema and Snowflake dimensional schema using Erwin Tool.
  • Expert in R programming and SAS Enterprise Miner for statistical data analysis, Tableau and Power BI for data visualization, Python for data mining, and advanced Excel for creating interactive workbooks using pivot tables and charts, filtering and sorting, conditional formatting, VLOOKUP, logic functions, and duplicate removal.
  • Cultivated experience working with web service technologies and formats such as HTML, WSDL, SOAP, UDDI, REST, and JSON under architectures including Three-Tier Architecture and Service-Oriented Architecture (SOA) for efficient information transfer and mapping across applications.
  • Performed API testing using Postman to verify that APIs return data in the JSON or XML format defined in the API documentation and required by the workflow.
  • Well versed in conducting various types of testing including Unit Testing, User Acceptance Testing (UAT), Black Box Testing, Stress Testing and Regression Testing and helped in designing and developing test plans and test scripts.
  • Managed requirements, tracked defects and documented performance reports. Determined the priority of defect along with developers and managers based on business value of the functionality and helped in resolving it by efficient tracking in JIRA.
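
Illustrative sketch only, with hypothetical table and column names, of the kind of join/NVL query described above:

    -- Illustrative only: hypothetical customer and order tables.
    SELECT c.customer_id,
           NVL(c.preferred_name, c.legal_name) AS display_name,
           COUNT(o.order_id) AS total_orders
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.customer_id
    GROUP BY c.customer_id, NVL(c.preferred_name, c.legal_name);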

TECHNICAL SKILLS

Programming: R Programming, SQL, Linux/UNIX, Python, Java, XML, JSON, HTML, CSS

BI tools: Tableau, R Studio, Power BI, Splunk, SAS Enterprise Miner, Adv. Excel, Jupyter

ETL tools: Pentaho Data Integration, Informatica Power Center, Control-M

Databases: PostgreSQL, Oracle 11g/12c, SQL Server, PL/SQL, PL/pgSQL, NoSQL, Teradata

AWS Services: Redshift, DynamoDB, S3, EC2, EMR, ELB

Modelling tools: Erwin, ER studio, Visio, Balsamiq, Mockup Screens, Rational Rose

Methodologies: SDLC-Waterfall, Agile-Scrum, Scrum/Waterfall Hybrid, Scrum/Kanban Hybrid, Scaled Agile Framework.

Business Skills: Impact Analysis, JAD Sessions, Change Management, SWOT Analysis, Project Planning, Project Scheduling, Project Budgeting, Cost Benefit Analysis, Root Cause Analysis.

Project Management: Atlassian (Jira, Confluence), Rally, MS Project, HP ALM, SharePoint.

Testing Tools: SOAP UI, HP QC v10.0, Cucumber, Selenium, QTP, Load Runner

Machine Learning: Classification, Regression, Clustering, NLP Text Mining, Time Series Analysis, Spatial Analysis, Decision Models (Excel-QM/Solver)

PROFESSIONAL EXPERIENCE

Confidential, Plano, TX

Programmer Analyst

Responsibilities:

  • Primarily involved in providing a high-quality technological solution for operational activities of ‘Chase Paymentech’.
  • Assist in design and development of command macros in Stratus VOS to automate system checks and monitoring.
  • Design and develop data validation, load processes, test cases, and error control routines using PL/SQL and SQL.
  • Develop SQL scripts to help the support team identify transaction patterns for every merchant (see the illustrative query after this list).
  • Develop and maintain interactive dashboards in Splunk to track daily/monthly transactions, updates/trends, and data quality based on business requirements.
  • Design ETL Informatica workflows and mappings incorporating complex transformation logic, resulting in efficient interfaces between source and target systems in a large-scale UNIX/RDBMS environment.
  • Create Control-M jobs to manage, schedule and monitor Informatica workflows in a production environment.
  • Provide level 2 technical support to E-commerce (card-not-present), Retail (card-present), and International merchants by overseeing escalated issues from start to resolution, and partner with various Service Departments to ensure escalated issues are resolved.
  • Assist with network issues across different protocols, coordinating with Network Support toward a solution and root cause analysis.
  • Participate in upgrades to ensure compliance with current rules and regulations for the banking industry as set forth by local, state, national, and international governments.
  • Document problems, steps taken, and resolutions of issues preventing the client from sending or receiving files recognized by management for identifying problems and providing resolutions in a timely manner.
  • Primarily involved in job design, technical reviews, and troubleshooting of jobs. Extensively involved in team review meetings and conference calls with a remote team.
  • Train new personnel in the management of file transmissions, demonstrating different methods of research.
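
Illustrative sketch only (hypothetical table and column names, Oracle-style SQL) of the kind of per-merchant transaction-pattern query referenced above:

    -- Illustrative only: hypothetical transactions table.
    SELECT merchant_id,
           TRUNC(txn_date) AS txn_day,
           txn_type,
           COUNT(*) AS txn_count,
           SUM(txn_amount) AS txn_total
    FROM transactions
    WHERE txn_date >= TRUNC(SYSDATE) - 30
    GROUP BY merchant_id, TRUNC(txn_date), txn_type
    ORDER BY merchant_id, txn_day;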

Environment: MS Office (Excel, PowerPoint, Word), HP SM, IBM Netcool, Geneos, Control-M, Informatica Power Center, Splunk, Veritas NetBackup, Stratus VOS, UNIX/Linux, Confluence, SharePoint, JIRA, Symphony, Adobe Chat.

Confidential, Columbus, Ohio

Business Analyst

Responsibilities:

  • Acted as a liaison between business users and the technical team, and spearheaded collaboration between offshore and onshore stakeholders.
  • Analyzed the As-Is and To-Be business processes of Mortgage eXpress (MX) to understand the business process and workflow.
  • Analyzed previously existing defect logs in ALM and worked closely with QA to identify the steps to reproduce defects in lower environments (IST1, IST2, UAT).
  • Investigated the most recent production occurrences when a defect was not reproducible in lower environments.
  • Generated various leads in lower environments to identify all possible scenarios in which a defect persisted.
  • Communicated with developers and SMEs to perform Root Cause Analysis (RCA).
  • Performed Impact Analysis to identify compliance, business, pipeline, and performance impacts due to defects.
  • Created user stories with acceptance criteria, background, details, QA approach, and dev approach in Jira, based on findings from defect logs in HP ALM.
  • Planned, organized, and conducted backlog grooming sessions with the business unit and technical team to walk through requirements and build a common understanding across teams for approvals and sign-off.
  • Worked closely with business stakeholders and SMEs to understand defects related to HMDA compliance, TRID regulation (TILA-RESPA Integrated Disclosure), HARP loans, and HBA loans.
  • Updated the tracker for sprint traceability and transparency.
  • Assisted QA in writing test scripts, test plans, and test cases, following Acceptance Test-Driven Development (ATDD) on the project.

Environment: Agile-Scrum, MS Office (Excel, PowerPoint, Word), JIRA, HP ALM, SharePoint, MAX (Mortgage Application eXpress), MPX (Mortgage Processing eXpress), Expere, PCE, MyCM, SOA.

Confidential, Boston, Massachusetts

Business System Analyst

Responsibilities:

  • Worked with AML analysts, AML investigators, the AML quality and management team, UX/UI teams, content teams, and senior stakeholders to understand their needs and gather requirements.
  • Conducted requirement workshops and JAD sessions for requirement gathering with all team members.
  • Reviewed the firm's compliance with AML rules under FINRA and the BSA, which set forth minimum standards for a firm's written AML compliance program. The basic tenets of an AML compliance program under FINRA are risk-based CIP, CDD, EDD, CTR, ongoing transaction monitoring, suspicious activity reporting, etc.
  • Created process and data flow diagrams, use cases, activity diagrams, UML diagrams, and user guides for the program.
  • Coordinated closely with the data management team to define data requirements.
  • Assisted the PO in breaking down epics, creating and further splitting user stories into vertical slices that deliver business value, and defining acceptance criteria in backlog grooming sessions.
  • Kept track of Issues, open requirements, delivery dates, reviews and sign-offs and communicated delays.
  • Created mock-ups and wireframes in Balsamiq for stakeholders, compliance analysts, AML analysts, the development team, the design team, and AML investigators.
  • Maintained the Requirements Traceability Matrix (RTM) across the deliverables of a project to keep track of the origin, status and end to end tracking of requirements.
  • Developed SQL queries, procedures, NVL statements, joins (inner, outer, left, right), and CTEs to manipulate, extract, and test data based on the functional requirements (see the illustrative query after this list).
  • Worked on API/web service interface specs in JSON, XML, and Swagger documents using Sublime Text, and supported their development.
  • Utilized Swagger to document SOAP- and REST-based requests and responses, consisting of a one-sentence description of the service, the request object, the response object, and the errors returned on failure. Integrated systems using REST/SOAP web service API calls.
  • Performed detailed analysis of the Suspicious Activity Monitoring and Currency Transaction Reporting alert modules to gather requirements for new functionality and reporting needs.
  • Involved in the development of test plans, test cases, and test procedures in manual and automated test environments for black box, stress, and regression testing.
  • Worked with the application development team on test issues and conducted daily status meetings with the Development and User teams during UAT.
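
Illustrative sketch only (hypothetical AML tables and columns) of the kind of CTE/join query used to extract and test data against functional requirements:

    -- Illustrative only: hypothetical alert and transaction tables.
    WITH high_value_txns AS (
        SELECT account_id, txn_id, txn_amount, txn_date
        FROM transactions
        WHERE txn_amount > 10000
    )
    SELECT a.alert_id,
           a.account_id,
           NVL(a.analyst_notes, 'PENDING REVIEW') AS review_status,
           t.txn_id,
           t.txn_amount
    FROM alerts a
    JOIN high_value_txns t ON t.account_id = a.account_id
    ORDER BY a.alert_id;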

Environment: Angular 2.0, World Compliance, AML Insight, Oracle WebLogic application server, Oracle 11g RDBMS, Java EE, Enterprise Java Beans (EJB), Trulioo REST API, Oracle, HTML 5, CSS3, JavaScript and jQuery, Bootstrap framework (v3.3.4), WebStorm, Jenkins, Git, Balsamiq, Informatica Power Center 8.6, Erwin, Atlassian (Jira and Confluence).

Confidential, Arlington, VA

Business Data Analyst

Responsibilities:

  • Reviewed the existing business process to identify how clients would benefit from the new web application platform.
  • Performed Gap Analysis to understand the As-Is and To-Be business processes and created business process models using Business Process Model and Notation (BPMN).
  • Conducted JAD sessions with a committee of Subject Matter Experts from various business areas to obtain domain-level information, interviewing stakeholders, asking detailed questions, and carefully recording requirements so they could be reviewed and understood by both business and technical teams.
  • Drafted the Business Requirements Document and Functional Requirement Specification in coordination with the application development team, Analytics team, and internal and external stakeholders.
  • Participated in backlog grooming, daily scrums, sprint planning, sprint reviews, and sprint retrospectives to report on the progress of the project and raise red flags, if any.
  • Helped the PO and team split user stories into tasks meeting SMART criteria and into vertical slices that produced maximum business value, and helped explain the business context to other team members.
  • Created use case diagrams, activity diagrams to get the Business user’s perspective of the application.
  • Designed Wireframes and Mockup screens for the developers to build multiple Windows Presentation Foundation (WPF) Forms, Custom Controls, User Controls and suggested Master Page themes.
  • Performed Source to Target data analysis and data mapping and created mapping documents.
  • Created the logical data model from the conceptual model and converted it into the physical database design using Erwin.
  • Built various end-to-end data warehousing solutions using Pentaho PDI (Kettle) to extract data from various aircraft information sources and populate it into our PostgreSQL database.
  • Created transformations configuring the following steps: Table input, Table output, Text file output, CSV file input, Insert/Update, Add constants, Filter, Value Mapper, Stream lookup, Join rows, Merge join, Sort rows, String operations, Database lookup, and Set environment variables.
  • Designed various SQL queries to validate data transformation and ETL loading (see the illustrative validation sketch after this list).
  • Employed best practices in creating a complete schema, including tables, relationships, stored procedures, views, clustered and non-clustered indexes, and triggers using PL/pgSQL as per project needs.
  • Optimized query performance by modifying PL/pgSQL queries, removing unnecessary columns, eliminating redundant and inconsistent data, normalizing tables, establishing joins, and creating clustered and non-clustered indexes where necessary.
  • Assisted the technical architect to map the services within web application for efficient information transfer.
  • Analyzed various aircraft lessors and manufacturers and created interactive dashboards in Tableau to present findings to the client.
  • Monitored client activity on the mba Redbook website and application by analyzing raw data extracted from Intercom.
  • Developed APIs that met the client's needs and documented them in the Technical Requirements Document.
  • Assisted in the preparation of User Acceptance Testing (UAT) test plans, test cases, and Testing Life Cycle schedules, implemented all the test scenarios with the help of the QA team, and reported bugs in JIRA.
  • Tested end to end flow of data, web form links & click-events in the application after integration of modules.
  • Actively used Atlassian tools like Confluence and Jira to track and manage all the projects related to ‘mba Redbook’ App.
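
Illustrative sketch only (hypothetical staging and target tables) of the kind of PostgreSQL validation and indexing described above:

    -- Illustrative only: confirm the ETL load moved every staged row into the target table.
    SELECT (SELECT COUNT(*) FROM staging.aircraft_values) AS staged_rows,
           (SELECT COUNT(*) FROM public.aircraft_values)  AS loaded_rows;

    -- Index frequently filtered columns to speed up lessor/manufacturer lookups.
    CREATE INDEX idx_aircraft_values_mfr
        ON public.aircraft_values (manufacturer, model_year);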

Environment: Pentaho Kettle 4.2.1, PostgreSQL 9.4, MySQL 5.7, R Studio, MS Excel 15, Tableau Desktop 9.0.1, DBeaver 4.0, Content Grabber, Erwin r9.6, Atlassian (Jira and Confluence), Slack, Postman, Intercom, XML, JSON.

Confidential, Columbus, Ohio

Business System Analyst

Responsibilities:

  • Elicited business requirements for the project using interviews, document analysis, surveys, site visits, and use case scenarios, and used the organization's standard templates to develop requirements.
  • Conducted brainstorming and JAD sessions to gather business requirements with Regulatory and Compliance SMEs and build a solid knowledge of CCAR reporting.
  • Performed Gap Analysis of the features and functionality of current platform version and new version to determine the new functions to be added in.
  • As part of CCAR, worked on Stress Testing solution to check the health of the portfolio of investments through liquidity reporting for the client to ascertain Capital adequacy.
  • Launched EC2 instances with the Amazon Linux AMI, bootstrapped Apache using Bash scripting, and configured auto-scaling and load balancers (ELB). Also defined Security Groups based on the access parameters provided.
  • Implemented and maintained monitors, alarms, and notifications for EC2 instances using CloudWatch and SNS.
  • Managed multiple relational databases using SQL and Amazon Web Services (EC2, S3, Redshift) for processing, cleaning, and verifying the integrity of data used for analysis.
  • Collaborated with the engineering team to standardize analytic methods and improve our analytics pipeline using the Redshift database and DynamoDB.
  • Acted as a liaison between business and technology teams, communicating project scope and objectives across the project.
  • Designed a detailed FRD and Solution Options Document per the BRD and Basel regulatory requirements using Power Designer.
  • Modeled the detailed design of the system using UML diagrams, including sequence diagrams, activity diagrams, and data flow diagrams, using SmartDraw.
  • Worked with Risk Reporting team to generate new Standard Pillar 3 Reporting framework and distinguish firm-wide global standards as per unique requirements for US site.
  • Extensively involved in the Data Governance process. Assisted in data grouping and defined relationships among the different tables.
  • Assisted architect in creating conceptual, logical and physical models for OLTP, Data Warehouse, Data Vault and Data Mart, Star/Snowflake schema implementations.
  • Analyzed existing data to identify critical data elements, performed source-to-target data mapping, elicited data requirements, and converted them into the BRD and SRS.
  • Assisted ETL team in creating ETL processes in Pentaho Kettle as per requirements.
  • Used the Pentaho job scheduler and its run-time engine to run the solution, test and debug its components, and monitor the resulting executable versions on an ad hoc or scheduled basis.
  • Designed and implemented complex SQL queries for QA testing and report validation (see the illustrative reconciliation query after this list).
  • Maintained well-organized, complete, and up-to-date project documentation, testing, and verification/quality control documents and programs in compliance with company standards.
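
Illustrative sketch only (hypothetical source and reporting tables) of the kind of reconciliation query used for QA testing and report validation:

    -- Illustrative only: flag reporting dates where source and report totals disagree.
    SELECT s.report_date,
           s.source_total,
           r.report_total,
           s.source_total - r.report_total AS variance
    FROM (SELECT report_date, SUM(exposure_amt) AS source_total
          FROM src_exposures GROUP BY report_date) s
    JOIN (SELECT report_date, SUM(exposure_amt) AS report_total
          FROM rpt_pillar3_exposures GROUP BY report_date) r
      ON r.report_date = s.report_date
    WHERE s.source_total <> r.report_total;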

Environment: Oracle SQL Developer, Pentaho Kettle v4.2.0, AWS (EC2, S3, Redshift, CloudWatch, IAM), Oracle 11g, Atlassian Jira and Confluence, MS Project, MS Office Suite, MS Visio, HTML, CSS, JavaScript, ASP.NET, C# .NET.

Confidential

Data Analyst

Responsibilities:

  • Gathered business and technical requirements by coordinating with source application BAs, Analytics team, Internal and external stakeholders with focus on Data Warehouse and reporting needs.
  • Ensured that the bank's data exposition and usage guidelines were followed to comply with Corporate Technology standards and regulatory requirements.
  • Performed impact analysis of any source system change on every aspect of DWH and reports by thoroughly understanding the DW design and reports portfolio.
  • Analyzed and gathered requirements for the preparation of the Business Requirement Document (BRD), Functional Requirement Document (FRD), and System Requirement Specification (SRS).
  • Identified the requirements and arranged peer reviews, walkthroughs and sign off meetings.
  • Assisted the ETL developer to create dimensional model (Star, Snowflake schema), slowly changing dimensions, and rapidly changing dimensions and to implement and validate the model.
  • Worked on Data Modeling, ER Diagrams, creating tables, Facts and Dimensions.
  • Created Data Dictionary to facilitate the mapping from source to destination tables.
  • Worked on developing Mappings, Mapplets, Sessions, Workflows and Worklets for data loads using Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Designed and developed complex mappings in Informatica to load the data from various sources using different transformations such as Pivot, Joins, Expression, Aggregate, Update strategy, Sequence generator, Joiner, Filter, Rank, and Router transformations.
  • Performed SQL validation to verify that the data loaded in database tables is in the proper format.
  • Wrote test cases covering data completeness, data quality, data transformation, performance and scalability, reporting, data calculations, integration testing, compatibility testing, regression testing, and user acceptance testing (see the illustrative checks after this list).
  • Facilitated User Acceptance testing and created the training to use the reporting for various departments. Reported the defects in JIRA which came up during the UAT and facilitated resolution.
  • Designed, developed, implemented and supported reporting and dashboard visualization through Tableau.
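
Illustrative sketch only (hypothetical warehouse dimension table) of the kind of completeness and duplicate checks behind the test cases above:

    -- Illustrative only: duplicate business keys after a load.
    SELECT customer_key, COUNT(*) AS dup_count
    FROM dim_customer
    GROUP BY customer_key
    HAVING COUNT(*) > 1;

    -- Mandatory attributes missing after transformation.
    SELECT COUNT(*) AS missing_names
    FROM dim_customer
    WHERE customer_name IS NULL;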

Environment: DB2, Informatica, SoapUI, Flat Files, Adv. Excel, MS PowerPoint, MS Access, Erwin Data Modeler, Tableau, Atlassian (Jira and Confluence).

Confidential

Jr. Database Administrator

Responsibilities:

  • Estimated table and tablespace sizes.
  • Created database objects such as tables, tablespaces, buffer pools, indexes, and views.
  • Analyzed tables and indexes for performance tuning.
  • Performed creation, maintenance, backup, and restoration of databases.
  • Analyzed and designed relational database for several clients.
  • Created PL/SQL functions, stored procedures, and packages to support reports.
  • Refreshed and manipulated data in tables using PL/SQL procedures.
  • Monitored and tuned rollback segments, temporary tablespaces, redo log buffers, and sort area size.
  • Created users with appropriate roles and privileges to protect data.
  • Performed data migration between databases using both the Oracle 10g/11g export/import (exp/imp) utilities and Oracle 11g Data Pump.
  • Provided the PM with progress/status reports as needed for assigned project plan tasks.
  • Performed daily RMAN backups and recovery for numerous Oracle 10g and 11g databases using physical backups (RMAN) and logical backups (export, import, and Data Pump).
  • Worked closely with application developers to design and create new Oracle databases, including creating new schemas and database objects, and promoted database changes from DEV through TST to PROD per requests from the development team.
  • Ran scripts to check database status, such as growing table sizes, extent allocation, free space, used space, and fragmentation (see the illustrative queries after this list).
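
Illustrative examples of the kind of Oracle data-dictionary checks referenced above (standard DBA views; the actual scripts varied by database):

    -- Free space per tablespace.
    SELECT tablespace_name,
           ROUND(SUM(bytes) / 1024 / 1024) AS free_mb
    FROM dba_free_space
    GROUP BY tablespace_name
    ORDER BY free_mb;

    -- Ten largest segments, to spot fast-growing tables.
    SELECT *
    FROM (SELECT owner, segment_name, ROUND(bytes / 1024 / 1024) AS size_mb
          FROM dba_segments
          ORDER BY bytes DESC)
    WHERE ROWNUM <= 10;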

Environment: Oracle 11g, Oracle 10g, Sun Solaris 10.0.
