
Sr. Data Architect/Analyst Resume

Dallas

SUMMARY

  • 14+ years of experience in the telecom industry with global companies, architecting, leading, designing and developing large-scale, enterprise-wide solutions for processing multi-terabyte data across multiple applications.
  • Strong understanding and experience in data analysis and in handling high-volume, complex structured and unstructured data
  • In-depth knowledge of Waterfall and Agile methodologies and thorough understanding of the SDLC phases: requirements gathering, design, development, testing, release and support.
  • Expertise in authoring requirements documents such as Business Requirements (BR), Functional Requirements (FR) and Requirements Traceability Matrix (RTM) documents
  • Expertise in SQL and performance tuning on large-scale RDBMS platforms (Teradata, Oracle, etc.)
  • Expertise in building end-to-end solutions using multiple technologies such as PL/SQL, J2EE, Unix shell scripting, Python, Spark and Hive
  • Actively involved in quality processes and release-management activities: establishing, monitoring and streamlining quality processes within the project.
  • Strong analytical thinking, decision-making and problem-solving skills.
  • Participated in review meetings with project managers, developers and business associates.
  • Ambitious, self-motivated, results-oriented engineering professional with the ability to work independently as well as in teams and to multi-task.
  • 14+ years of experience in the software development industry, with an emphasis on full life cycle software development using best practices.
  • 14+ years of Revenue Assurance / Data Quality experience, including Data Analysis, ETL, Reconciliation, and Data Conversion.
  • 10+ years in custom application development using PL/SQL, SQL, Shell scripts, awk Script, J2EE, Teradata.
  • 8+ years of experience designing and developing custom, integrated, enterprise wide Revenue Assurance/Data Quality solutions.
  • 7+ years of management experience. Teams ranged in size from 5-15, including off-site and offshore resources.
  • 6+ years of Data Modeling and Architectural experience including Physical & Logical data modeling.
  • 3+ years of Big Data Experience on processing high volume of data using Spark, Hive, other Big Data technologies, Python.
  • Helped design and develop multiple revenue assurance reconciliations/audits, resulting in millions in revenue savings and improved customer experience for Confidential
  • Involved in design and development of an automated XML-generation process for a bespoke ETL tool, saving multiple man-days of manual configuration for Confidential
  • Involved in design and implementation of a Big Data-based revenue assurance solution for Telekom Malaysia, resulting in revenue savings
  • Pivotal in design and development of an automated network inventory correction solution covering 2G, 3G and 4G logical objects for Confidential, enabling better network planning.
  • Key member in design and development of golden records for Account, Device and Telephone Number across telecom OSS/BSS systems and correction of data in downstream systems, resulting in millions in revenue savings and improved customer experience
  • Pivotal in design and deployment of a new Fraud Management System at Talk Talk

TECHNICAL SKILLS

Platforms: Windows, GNU/Linux, HP-UX, Microsoft Azure

Operating System: Windows 9x/2000/XP/NT, MS-DOS, HP-UX, Unix, IBM AIX 4.3/4.2 and Solaris

Programming Languages: SQL, Shell scripting, awk scripting, Core Java, Spring, Hibernate, Python, Hive, Spark, React JS, HTML, CSS, Servlets, jQuery, JavaScript, RESTful API, JSON, XML, JSX, AJAX, JSP

Databases & ETL Tools: Teradata V2R4/V12/V13/V14, Oracle 12c/11g/10g/9i, SQL Server 2005/2000/7.0/6.0, MS Access, MS Visio, Star Schema, Snowflake Schema, Extended Star Schema methodology, Physical and Logical Modeling, Sqoop, MongoDB

Teradata Database Utilities: Teradata SQL Assistant, FastLoad, MLoad, FastExport, TPump, SQL*Loader, Export/Import

Reporting Tools: Qlik Sense, iReport

Other App Utilities: SQL*Plus, stored procedures, functions, exception handling

Methodologies (SDLC): Agile and Waterfall

Packages: MS Visio, MS Project, MS Office (SRD, BRD, etc.)

Tools: Eclipse, Jupyter Notebook, Jenkins, Talend, Toad, Postman, Spring Tool Suite, JIRA, Confluence, Kibana, GIT, Visual Studio, Swagger UI

Application Server: Tomcat, JBOSS

PROFESSIONAL EXPERIENCE

Confidential, Dallas

Sr. Data Architect/Analyst

Responsibilities:

  • Collaborate with business users, stakeholders and subject matter experts (SMEs) to understand requirements at a techno-functional level.
  • Convert Business requirements to Functional requirements, Functional to System requirements.
  • Create requirement traceability matrix to link back the test requirements to Business requirements.
  • Analyse multiple source systems and define data extraction mechanism, methodology and frequency
  • Development of Spark application for reconciliation process
  • Performance improvement using PL/SQL for existing reconciliation and audit processes
  • Development of application for extraction of data from Teradata and loading the same in Oracle
  • Design and development of an XML-generation application in Java for use in a bespoke ETL tool
  • Design and Development of Java Spring application for reconciliation and data processing
  • Design and define application to reduce repetitive tickets raised by end users
  • Worked on new initiative for using AI/ML to identify new areas/audits across Revenue Assurance
  • Co-ordinate with offshore development team to identify priorities and update scope and delivery schedules.
  • Performance optimization of transformation jobs for processing high volume of data
  • Identify new audit points and correction methodology based on data analysis across multiple applications, resulting in improved customer experience and reduced inbound call-centre calls
  • Used Elasticsearch (Kibana logs) to identify promos and offers presented to customers based on application logs
  • Performed API calls to get real-time customer information for building a decision engine, pivotal in adding missing offers/promos to customer orders
  • Validate data ingested into the data lake from RDBMS sources and files against business logic after ETL.
  • Perform RESTful API calls using Oracle PL/SQL
  • Involved in transformation of a legacy front-end application to J2EE-based architecture (Java Spring Boot, React JS, Swagger)
  • Designing and developing the end points (controllers), business layer and DAO layer using Hibernate/JDBC templates and Spring IoC (dependency injection).
  • Designing and implementing RESTful web services, with endpoints configured to produce and consume content as required.
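The reconciliation work described above can be sketched in plain Python. This is a minimal, illustrative sketch only: the record layout and field names (`order_id`, `amount`) and the source names (`billing`, `crm`) are hypothetical, not taken from any real system described in this resume.

```python
# Minimal sketch of a two-way reconciliation between two source systems.
# Field names (order_id, amount) and sources (billing, crm) are illustrative.

def reconcile(source_a, source_b, key="order_id"):
    """Compare records from two systems keyed on `key` and report discrepancies."""
    a_by_key = {rec[key]: rec for rec in source_a}
    b_by_key = {rec[key]: rec for rec in source_b}

    missing_in_b = sorted(set(a_by_key) - set(b_by_key))   # present in A only
    missing_in_a = sorted(set(b_by_key) - set(a_by_key))   # present in B only
    mismatched = sorted(                                   # same key, different values
        k for k in set(a_by_key) & set(b_by_key)
        if a_by_key[k] != b_by_key[k]
    )
    return {"missing_in_b": missing_in_b,
            "missing_in_a": missing_in_a,
            "mismatched": mismatched}

billing = [{"order_id": 1, "amount": 10}, {"order_id": 2, "amount": 20}]
crm     = [{"order_id": 1, "amount": 10}, {"order_id": 3, "amount": 30},
           {"order_id": 2, "amount": 25}]

report = reconcile(billing, crm)
# report flags order 3 missing from billing and order 2 with a value mismatch
```

In a production setting the same key-and-compare pattern runs over Spark DataFrames rather than in-memory lists, but the discrepancy categories (missing either side, value mismatch) are the same.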

Confidential

Sr. Data Architect/Analyst

Responsibilities:

  • Collaborate with business users, stakeholders and subject matter experts (SMEs) to understand requirements at a techno-functional level.
  • Gathered user requirements, analysed source systems and source data, created functional and technical designs, and performed complex process design for a Hadoop-based solution
  • Analyse Source systems and define data extraction mechanism, methodology and frequency.
  • Define business rules to perform Data Profiling
  • Define normalization and reconciliation processes to perform data integrity checks and identify discrepancies
  • Design transformation jobs to create golden record (master data) at subscriber level by collating data from various BSS systems
  • Define detection rules and metrics to identify and prioritize revenue leakage points
  • Solution Blueprinting for all controls of Managed Accounts and Mass Market Subscribers across Fulfilment and Billing processes
  • Analysis of discrepancies and Exception for defining correction approach in respective source systems
  • Define check points for incremental data processing and correction updates in source systems
  • Performance optimization of transformation jobs for processing high volume of usage data
  • Define KPIs and reports based on data analysis, data and process flows
  • Set up the data lake and implemented data extraction into the data lake using Kafka and Sqoop
  • Implementation of reconciliation/audit process using Spark and Hive
  • Define end to end data lineage and data governance policies
  • Set up Data virtualization layer to be used by downstream reporting tools like Tableau, QlikSense
  • Involved in data Migration from existing legacy application to new Big Data based application
  • Involved in design and development of front end application using Spring, Hibernate, CSS, XHTML
  • Provided wireframes to the development process using traditional tools such as Excel, Word and Paint
  • Worked with different File Formats like TEXTFILE, ORC, and PARQUET for Spark/Hive querying and processing
  • Experienced in working with the Spark ecosystem, using Spark SQL and Scala to query different formats such as text and CSV files.
  • Worked with NiFi for managing the flow of data from sources through automated data flow.
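The golden-record (master data) creation mentioned above can be sketched as a source-precedence merge: for each subscriber, take each field from the most trusted system that has a value for it. The source names and precedence order below are illustrative assumptions, not the actual BSS systems involved.

```python
# Sketch of golden-record creation: merge one subscriber's records from
# several BSS systems, preferring the most trusted source per field.
# Source names and precedence order are hypothetical.

PRECEDENCE = ["billing", "crm", "provisioning"]  # most trusted first

def golden_record(records):
    """records: {source_name: {field: value}} for a single subscriber."""
    merged = {}
    # Walk sources from least to most trusted, so trusted values overwrite.
    for source in reversed(PRECEDENCE):
        for field, value in records.get(source, {}).items():
            if value is not None:           # never overwrite with a missing value
                merged[field] = value
    return merged

subscriber = {
    "billing":      {"msisdn": "447700900001", "plan": "Gold", "email": None},
    "crm":          {"msisdn": "447700900001", "plan": "Silver",
                     "email": "a@example.com"},
    "provisioning": {"msisdn": "447700900001", "device": "SIM-123"},
}

master = golden_record(subscriber)
# billing's "plan" wins over crm's; crm supplies "email" since billing has none
```

The same merge logic scales out as a grouped transformation (e.g. a Spark `groupBy` on subscriber key) when collating millions of subscribers.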

Confidential

Solution Architect/Project Manager

Responsibilities:

  • Convert Business requirements to Functional requirements, Functional to System requirements.
  • Create requirement traceability matrix to link back the test requirements to Business requirements.
  • Design & define metrics, attributes & filters for each requirement.
  • Perform data analysis, gap analysis and risk analysis between existing systems/applications and document the analysis.
  • Create application design and high-level design based on customer requirements on RDBMS/Shell Script/J2EE application.
  • Understanding the flow of usage records from network to billing and then charting out the probable leakage and control points
  • Analyse source system and data to define transformation and reconciliations to identify data integrity issues and discrepancies
  • Design transformation and reconciliation processes.
  • Analysis of discrepancies and collaboration with the BSS teams for root-cause identification and definition of the data-correction approach
  • Co-ordinate with different BSS teams such as Billing and CRM for data correction
  • Define and develop KPIs, reports and dashboards in Jasper (iReport)
  • Manage the offshore team

Confidential

Solution Architect

Responsibilities:

  • Understanding of data model for Logical Objects (2G, 3G, 4G) in Network and Inventory
  • Participate in discussion for requirement gathering for optimal reconciliation coverage across Telenor Mobile Network and Inventory system.
  • Design of a configurable network adaptor capable of reading multiple XML file formats for extraction of logical objects from the Network Management System
  • Design a logical data model combining multiple objects into a single object
  • Define data extraction mechanism from Network Management System and Inventory for logical objects
  • Design normalization process for Network attribute and Inventory attributes
  • Define and design process / rules for correction of data in Inventory and validation of Network data
  • Analysis of discrepancies and co-ordinate with BSS (Inventory) team for correction of data
  • Create the solution blueprint document for the development team
  • Define and develop a Java-based application for automated correction in BSS (Inventory) based on discrepancy type
  • Define and develop audit and correction workflows using Java, Spring and Hibernate
  • Develop reconciliation/audit processes using shell script/PL/SQL
  • Define and develop KPIs, audit reports, correction reports and dashboards using a Java/JSP/Hibernate-based framework
  • Developed a process using Java 1.5 (JavaBeans, SAX handler) that parses network data from XML files for use in the reconciliation/audit process.
  • Worked closely with the QA team to define and implement the test strategy and approach
  • Worked with the deployment team to plan the cutover strategy and go-live
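The SAX-based extraction of logical objects described above can be sketched with the stdlib streaming parser (shown here in Python rather than the Java 1.5 SAX handler actually used; the element and attribute names `cell`, `id`, `tech` are hypothetical stand-ins for real network objects).

```python
# Sketch of SAX-style extraction of logical objects from a network-management
# XML export. Element/attribute names are illustrative, not a real NMS schema.
import xml.sax

class CellHandler(xml.sax.ContentHandler):
    """Collects one dict per <cell> element as the document streams past."""
    def __init__(self):
        super().__init__()
        self.cells = []

    def startElement(self, name, attrs):
        if name == "cell":
            self.cells.append({"id": attrs.get("id"),
                               "tech": attrs.get("tech")})

NETWORK_XML = b"""<network>
  <cell id="C1" tech="4G"/>
  <cell id="C2" tech="3G"/>
</network>"""

handler = CellHandler()
xml.sax.parseString(NETWORK_XML, handler)
# handler.cells now holds the extracted logical objects
```

SAX is the natural choice here because NMS exports can be very large and a streaming handler never loads the whole document into memory.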

Confidential

Data Quality Consultant/Analyst

Responsibilities:

  • Analysis of various BSS elements to identify non-synchronous data, and design of reconciliation and correction flows to correct data in downstream systems.
  • Led and executed a project with 10+ members end to end.
  • Co-ordinate with different BSS teams like Billing, Inventory, Data warehouse, Off-shore and Near-shore teams for understanding Confidential BSS, data flow and sprint deliverables
  • Closely working with business analyst team to understand the detailed solution requirements and business expectations
  • Design and develop golden record creation for Account, Device, Telephone Number across OSS / BSS systems
  • Design reconciliation processes for processing high volume of data from various source system in cluster environment
  • Define check points for incremental data processing and correction updates in source systems
  • Define correction approach for auto correction of discrepancies in source system via API, web-services, SQL Script.
  • Develop process using Unix Shell Script/awk Script/PLSQL/Java to apply transformation rules
  • Develop processes to maintain master data for Account, Device and Telephone Number to be used and referenced by other downstream systems
  • Adhere to ETL/Data Warehouse development Best Practices.
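The checkpoint-driven incremental processing mentioned above can be sketched as: each run reads only records past the last stored watermark, then advances it. This is a minimal illustration; in practice the checkpoint store would be a control table rather than a dict, and the watermark is often a timestamp rather than an id.

```python
# Sketch of checkpoint-based incremental processing: each run picks up only
# records newer than the stored watermark, then advances it.
# The in-memory dict stands in for a real checkpoint/control table.

def incremental_batch(records, checkpoints, source):
    """Return records with id above the stored checkpoint, then advance it."""
    last = checkpoints.get(source, 0)
    batch = [r for r in records if r["id"] > last]
    if batch:
        checkpoints[source] = max(r["id"] for r in batch)
    return batch

checkpoints = {}
day1 = [{"id": 1}, {"id": 2}]
day2 = [{"id": 1}, {"id": 2}, {"id": 3}]   # ids 1 and 2 already processed

first = incremental_batch(day1, checkpoints, "billing")
second = incremental_batch(day2, checkpoints, "billing")
# first run processes ids 1-2; second run sees only the new id 3
```

Advancing the checkpoint only after a successful batch keeps the process restartable: a failed run simply re-reads from the old watermark.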

Confidential

Techno-Functional Lead

Responsibilities:

  • Setting up a new Fraud Management System to replace the existing legacy Fraud Management system
  • Co-ordinate with different teams like Intrinsic product team, Intrinsic configuration team, Talk Talk OSS/BSS teams for implementation of FMS
  • Work with Fraud Analyst in translating business requirements into Functional Requirements Document and Detailed Design Documents
  • Involved in gathering requirements, designing and development of configuration web pages and business logic for Fraud Management Application
  • Design queries for dashboards and reports across a multi-dimensional, multi-level, drill-down-enabled reporting package that provides an operator a holistic view of fraud-management KPIs
  • Performed and presented data manipulation and data analysis for complex data.
  • Develop and schedule data extraction from multiple sources using direct db load and file based extract
  • Implemented an error-handling strategy in all crucial steps of transformations, sending mail notifications and logging errors to an error table.
  • Tested data and data integrity among various sources and targets; assisted the production support team with various performance-related issues.
  • Collaborated with Database Administrators, Developers, and Analysts on all aspects of data management.

Confidential

Technical Lead

Responsibilities:

  • Setting up end-to-end revenue assurance activities, including identification, execution, analysis, correction and protection RA processes for the GPRS service
  • Analysis of various GPRS event charging scenarios
  • Involved in design activities such as High-level design, Low level design of project.
  • Defining the revenue calculation approach for prepaid / post-paid subscribers
  • Validation of revenues calculated by application with Mobily RA teams
  • Automation of revenue calculation processes for prepaid / post-paid subscribers
  • Define & develop reports and dashboard using iReport (Jasper)
  • Designed and developed data marts using Star and Snowflake schemas.
  • Used XPath to navigate through elements and attributes in XML documents.
  • Used XQuery to query XML data, analogous to SQL for relational databases.
  • Worked with Workflow and Rules Engines.
  • Written complex SQL queries using Joins and Views.
  • Used SVN for version control.
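The XPath navigation described above can be illustrated with the limited XPath support in Python's stdlib `ElementTree` (shown in Python for consistency with the other sketches; the element names and attributes below are hypothetical, not real telecom XML).

```python
# Sketch of XPath-style selection over an XML document using the subset of
# XPath supported by the stdlib ElementTree. Element names are illustrative.
import xml.etree.ElementTree as ET

DOC = """<subscribers>
  <subscriber status="active"><msisdn>111</msisdn></subscriber>
  <subscriber status="ceased"><msisdn>222</msisdn></subscriber>
</subscribers>"""

root = ET.fromstring(DOC)
# Predicate [@status='active'] filters on an attribute, as in full XPath.
active = [s.find("msisdn").text
          for s in root.findall("./subscriber[@status='active']")]
# only the active subscriber's number is selected
```

Full XPath engines (and XQuery) add axes, functions and joins on top of this, but attribute predicates and path steps like the above cover most element-and-attribute navigation.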

Confidential

Revenue Assurance Analyst

Responsibilities:

  • Requirement analysis and technical design
  • Responsible for the design, analysis and development of the backend using UNIX shell scripting, CLOVER-ETL and PL/SQL.
  • Analysis of Ordering & Provisioning/Activation flow for various wireless and wire line services provided
  • Identify potential leakage points in the OSS/BSS landscape.
  • Developed GUI using JSP, JavaScript, which interacts with back-end framework-components to pass the user requested data.
  • Developed framework components and Spring controllers using Spring Framework 1.2, which represent the view part of MVC and are responsible for populating JSP pages.
  • Developed Persistence classes and XML configuration files for Hibernate mapping to database.
  • Performed the integration testing with other applications and bug fixing.
  • Used Log4j to implement logging facilities.
  • Configured and Deployed application on WebSphere Application Server.
  • Involved in the understanding and designing of the complex back-end framework.
  • Developed JavaBeans (DAOs) containing complex SQL queries to perform CRUD operations in the database.
  • Participated in the Code reviews.
  • Helped team members to setup environments/configuration
  • Arranged regular status report meeting with the team and with the Client.

Confidential

Developer

Responsibilities:

  • Requirement analysis and technical design
  • Involved in writing SQL/PL/SQL procedures and cursors for handling complex issues.
  • Followed the MVC pattern to design reusable framework components
  • Followed J2EE design patterns to develop framework components: Business Delegate, Service Locator and DVO.
  • Used Struts 1.1 as the web framework.
  • Developed GUI screens using JSP, JavaScript and Struts tags.
  • Written complex SQL queries.
  • Helped tune the database to resolve performance issues and ensured high performance through periodic performance evaluation and improvements.
  • Developed test cases for unit testing of JSP pages.
  • Participated in Code reviews.
