Systems Analyst Resume
SUMMARY
- Fifteen years of total IT experience in roles including Systems Analyst, Data Modeler, Data Architect, Product Owner, and Developer
- More than six years working on data projects, spanning data warehousing on Hadoop and the master data (MDM) space
- Expertise in data analysis, data profiling, data modelling, data migration, and data mapping activities.
- Deep knowledge of data analysis with hands-on SQL skills. Experience with Sybase, Hive, Hadoop, DB2, SQL Server, and Oracle databases.
- Extensive experience creating use cases, functional requirements documents, data flow diagrams, data mappings, S2TMs (source-to-target mappings), and business rules.
- Expertise across the full SDLC, including requirements analysis, design, development, testing, and project management. Experience working with both the Waterfall model and Agile methodology.
- SAS 9.4 Base Programming and SAS Certified Statistical Business Analyst Using SAS 9
- SAFe for Teams from Scaled Agile, Certified Scrum Master (CSM), and CBAP certified by IIBA
- Stibo MDM Solution Fundamentals.
TECHNICAL SKILLS
Data Tools: Informatica MDM, Informatica Data Quality (IDQ), Hadoop, Hive, PL/SQL, Pentaho Kettle, Unix, SAS, Python, Tableau
Database: AWS - S3, RDS, Hive, Druid, MySQL, Sybase, DB2, SQL Server 2010, Oracle, Netezza, Teradata, IMS-DB, VSAM, XML
Data Modelling: Teradata CLDM for Telecom, ADRM for Retail, PowerCenter, Erwin
Business Analysis: Requirement analysis & modelling, Jira, Confluence, Balsamiq UI, Integrity APPM, Microsoft tools, iRise, Rally, Quality Center, ClearQuest
Legacy Technology: COBOL, JCL, CICS, Xpediter, File-AID, IDCAMS, SORT, ChangeMan, Endevor, MQ, stored procedures
PROFESSIONAL EXPERIENCE
Confidential
Systems Analyst
Responsibilities:
- Identify critical data elements, business rules, data rules, workflows, and requirement user stories.
- Conduct data profiling to identify data quality issues, build cleansing rules, and conform reference data (see the profiling sketch after this list).
- Create conceptual & logical data models for Product MDM requirements
- Create mappings from the MDM data model to the canonical model
- Identify critical data elements and match & survivorship rules
- Perform data profiling to ascertain data quality & identify enrichment and standardization opportunities
- Streamline the existing Product CRUD process to address identified pain points
- Customize ADRM canonical data model for enterprise product domain
- Create mappings for Commerce, OMS and other downstream systems from the canonical model
- Design the Logical Data Model using ERWIN.
- Define data dictionaries, data & process ownerships from data governance perspective.
- Product owner for Master data Product functionality (PIM).
- Work with Product Manager to define Product Increment and features to be addressed in the PI.
- Translate business requirements for the PI features into user stories.
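A minimal illustration of the profiling step above, written as Hive-flavoured SQL and assuming a hypothetical staging table stg_product with columns product_id, upc, brand_code, and product_desc (table and column names are illustrative only, not the engagement's schema); the same pattern was repeated per critical data element to quantify completeness and surface candidate cleanse rules.

    -- Profile completeness, distinctness, and format quality of candidate critical data elements
    -- (stg_product and its columns are hypothetical names used for illustration)
    SELECT
        COUNT(*)                                                          AS total_rows,
        COUNT(upc)                                                        AS upc_populated,
        COUNT(DISTINCT upc)                                               AS upc_distinct,
        SUM(CASE WHEN brand_code IS NULL OR TRIM(brand_code) = '' THEN 1 ELSE 0 END) AS brand_code_blank,
        SUM(CASE WHEN upc RLIKE '^[0-9]{12,14}$' THEN 0 ELSE 1 END)       AS upc_bad_format
    FROM stg_product;

Attributes showing high blank or bad-format counts became inputs to the cleansing and standardization rules.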
Confidential, Overland Park, KS
Systems Analyst
Responsibilities:
- PCMD Integrated Data Warehouse: Implementation of a Per Call Measurement Data (PCMD) warehouse for network data. The project involves ingesting multiple feeds throughout the day, up to 50B records/day, and building canned & UI reports after enriching this data with tower, geo & customer information.
- ‘As Is’ process analysis to map pain points.
- Identify and solicit new analytics, reporting and KPIs requirements
- Fact & dimension analysis to create a heat map of data used in the enterprise. Conform data across multiple domains to create an enterprise view. Dimension analysis to identify conformance of similar reference data.
- Design the ingestion approach (batch, near-real-time & streaming) to read feeds from different servers using NiFi.
- Create Conceptual, Logical & Physical data model for PCMD application
- Create aggregated models - cubes in Druid for PCMD data based on report usage patterns and projections (see the aggregation sketch below).
- Create S2TM for Transactional data and aggregated models.
- Create technical specification for reports to be built from new data model.
- Create Data dictionary for new transactional model and Aggregation models.
- Identify sources, cleansing & initial data load process for Dim tables. Solidify process for future updates.
- Verify enriched data in the target transactional model and aggregated models.
- Assist with UAT for the new system
Environment: Hortonworks Data Platform (HDP), NiFi, Ambari, Netezza, SQL Server, MySQL, Druid, Spark, Kafka, Tableau
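A sketch of the kind of Hive aggregation that fed the Druid cubes mentioned above; pcmd_call_detail, dim_tower, and pcmd_daily_cube are hypothetical table names, and the real cubes used grains and measures derived from the report usage analysis.

    -- Build a daily aggregate by market and tower for Druid ingestion
    -- (all table and column names below are illustrative, not the actual PCMD schema)
    INSERT OVERWRITE TABLE pcmd_daily_cube PARTITION (call_date)
    SELECT
        t.market,
        t.tower_id,
        COUNT(*)            AS call_count,
        SUM(c.dropped_flag) AS dropped_calls,
        AVG(c.rsrp_dbm)     AS avg_rsrp,
        c.call_date
    FROM pcmd_call_detail c
    JOIN dim_tower t
      ON c.tower_id = t.tower_id
    GROUP BY t.market, t.tower_id, c.call_date;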
Confidential, Bellevue, WA
Systems Analyst
Responsibilities:
- Integrated Data Warehouse (IDW): Implementation of a warehouse of warehouses - subject areas: Party, Product, and Digital Analytics
- Implemented database procedures, triggers and SQL scripts for development teams.
- Source analysis to identify the attributes required in the IDW model and to conform those attributes across multiple transactional systems such as SAP, Customer Hub (Oracle MDM), and legacy transaction systems (see the conformance-check sketch below).
- Liaise with external digital analytics partners such as Adobe SiteCatalyst, Google Analytics, Blue 449, Selligent, and Salesforce to ingest and enrich customer usage and behavioral data in the data warehouse.
- Use the Informatica data profiling tool (IDQ) to profile source data and identify data quality and data cleansing rules for the new system.
- Create Conceptual & Logical data model for the IDW subject areas from Teradata CLDM model.
- Create Source to Target mapping (S2TM) documentation for the new model.
- Liaise with the business users to identify business rules for the report requirements. Analysis of multiple source systems to identify appropriate data to build these reports for a tactical solution.
- Identify and classify source & target attributes to enforce privacy restrictions based on corporate information security guidelines.
- Evaluated and benchmarked data quality tools for big data profiling from Oracle, Talend & Ataccama and provided a recommendation for procurement.
Environment: Hortonworks Data Platform (HDP), AWS-S3, NiFi, Ambari, Hive, Teradata, Oracle DB, Spark, Kafka, Tableau
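A simplified example of the cross-system conformance checks referenced above, assuming hypothetical staging tables stg_sap_customer and stg_custhub_party that share a customer identifier; the production checks were driven by the IDQ profiles and S2TM rules rather than hand-written queries like this one.

    -- Flag party records whose email address does not conform between the SAP and Customer Hub feeds
    -- (table and column names are hypothetical, for illustration only)
    SELECT
        s.customer_id,
        s.email_addr AS sap_email,
        h.email_addr AS hub_email
    FROM stg_sap_customer s
    JOIN stg_custhub_party h
      ON s.customer_id = h.customer_id
    WHERE COALESCE(LOWER(TRIM(s.email_addr)), '') <> COALESCE(LOWER(TRIM(h.email_addr)), '');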
Confidential
Systems Analyst - Data Modeler
Responsibilities:
- Master Data Management: Implementation of a Master Data Management solution across five data domains, including Organization, Instruments, Person, and common reference data.
- Gathered functional requirements and use cases by conducting workshops with business users and Data Stewards.
- Performed data analysis and data profiling of existing legacy systems and external vendor feeds to identify survivorship sources and trust rules (see the survivorship sketch below).
- Identified business rules & data validation rules.
- Created data mappings from source systems to target MDM data elements.
- Tested data from source systems to the target system with a focus on trust rules and match & merge processing.
- Performed data modelling to build logical & physical data models for the data domains.
Environment: Microsoft Word, Excel, Visio & Microsoft Project, Informatica MDM, Informatica Data Quality, Sybase, Oracle
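A simplified survivorship sketch for the trust and match & merge rules noted above, assuming hypothetical tables stg_org_master (candidate organization records grouped by a match key) and ref_source_trust (per-source trust rank); the production rules executed inside Informatica MDM, and this query only mirrors the logic exercised during testing.

    -- Keep one surviving record per match group: highest-trust source first, most recent update as the tie-breaker
    -- (stg_org_master and ref_source_trust are illustrative names, not the actual MDM schema)
    SELECT org_match_group, org_name, source_system, last_update_ts
    FROM (
        SELECT o.org_match_group, o.org_name, o.source_system, o.last_update_ts,
               ROW_NUMBER() OVER (
                   PARTITION BY o.org_match_group
                   ORDER BY t.trust_rank ASC, o.last_update_ts DESC
               ) AS rn
        FROM stg_org_master o
        JOIN ref_source_trust t
          ON o.source_system = t.source_system
    ) ranked
    WHERE rn = 1;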