Data Analyst/BI Resume
Columbus, OH
SUMMARY
- Extensive experience in data modeling and data analysis, with a strong background in designing data marts, data warehouses, and data lakes.
- Experienced in SDLC, Scrum, and Agile software development life cycles for Business Intelligence: requirements gathering, data modeling, building visualizations, and deployment.
- Diverse experience with multiple Business Intelligence dashboarding and reporting applications, including Tableau, QlikView, and Splunk Reporting.
- Strong interpersonal and communication skills; provided training and guidance to users on complex topics and delivered visual presentations to junior and senior audiences.
- Ability to work in a fast-paced environment while remaining productive and professional.
- Worked on systems across diverse domains, including financial, insurance, telecommunications, mortgage, cybersecurity, and scientific data.
TECHNICAL SKILLS
Reporting: Cognos 10.x/8.4 (Framework Manager, Cognos Connection, Report Studio, Query Studio, Analysis Studio), Actuate Reporting Suite 10.0
ETL Tools: Alteryx, SAS Data Integration Studio 4.4, Information Map 4.3.1, SQL Server Integration Services (SSIS) 2008/2005, Informatica PowerCenter 8.x/7.x, Salesforce Apex Data Loader, QlikView Expressor 3
Databases: Teradata, Oracle 11g/10g/9i, MS SQL Server 2008/2005, DB2, TOAD, Excel and MS Access 2007, Big Data
Web: IIS, Tomcat, Netscape, WebLogic 6.0
Version Control: TortoiseSVN, Harvest, Documentum, Visual SourceSafe, Magic, Test Director, SharePoint, Git, Jenkins
Dashboarding: QlikView 12.x, Tableau 8.2/9.1/9.2, QlikSense
CRM: SalesForce.com
Others: Splunk, PEGA, Symantec DLP, ArcSight, Cloud, Machine Learning, Python, Security Information and Event Management (SIEM) tool, TigerGraph, Altova XML, ER Studio
PROFESSIONAL EXPERIENCE
Confidential, Columbus OH
Data Analyst/BI
Responsibilities:
- Analyzed, managed, maintained, and updated logical and physical data models using a variety of data modeling philosophies and techniques across multiple domains.
- Partnered closely with data stewards to understand the needs of the business and to identify the relationships and dependencies the models needed to represent.
- Met with business teams and data stewards to take the conceptual model from initial draft to final version based on data captured by the team.
- Modeled entities, attributes, relationships, and keys that logically map to the conceptual model.
- Converted the conceptual model to a physical model using the Altova XML tool for both the relational database and the data warehouse.
- Profiled, wrangled, and prepared source data, and met regularly with IT partners to develop complete source-to-target data mappings.
- Performed data profiling, data mining, data cleansing, and data scrubbing, and translated data via rules from one database to another.
- Defined and developed roadmaps with the business to streamline business data for better utility and management.
- Removed discrepancies between the conceptual model and the physical model; identified future relationships between entities.
- Established and maintained comprehensive data model documentation.
- Used and managed the data dictionary to determine data attributes and their usage.
- Built Tableau reports for data analysis by analysts and upper management.
- Used custom filters and parameters to build more robust and user-friendly dashboards.
- Wrote SQL queries against a homegrown Access database.
- Interacted with development, enterprise architecture, business intelligence, and technology teams on a regular basis.
- Investigated data quality issues to determine root cause, resolved data issues, and recommended process changes to prevent recurrence.
- Researched emerging technologies and built use cases to determine their best fit for the business.
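The data profiling described above (null checks, uniqueness, distinct counts) can be sketched in a few lines of Python. This is a minimal illustration, not the actual process; the column names and sample rows are hypothetical.

```python
# Minimal data-profiling sketch: per-column null count, distinct count,
# and a uniqueness flag. Column names and rows are hypothetical examples.
def profile(rows, columns):
    """Return per-column null count, distinct count, and uniqueness flag."""
    report = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
            "unique": len(set(non_null)) == len(non_null),
        }
    return report

rows = [
    {"loan_id": 1, "state": "OH"},
    {"loan_id": 2, "state": None},
    {"loan_id": 3, "state": "OH"},
]
print(profile(rows, ["loan_id", "state"]))
```

A report like this is typically the first pass before defining cleansing and translation rules.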
Development Tools: ER Studio Data Architect, Altova XMLSpy, Tableau, Python, Graph Database, Git, Jira, Jenkins
Confidential, Cincinnati, OH
Data Analyst/BI
Responsibilities:
- Analyzed data in Guidewire for Policy, Billing, and Claims.
- Mapped the conversion of data from legacy systems for import into the Guidewire system.
- Traced data sources and defined extraction of data from the backend Guidewire database to DataHub and InfoCenter.
- Prepared mapping documents for Homeowners Dwelling and Residential Owners.
- Worked in partnership with data warehouse team leads to prepare data movements.
- Used the data dictionary to determine data attributes and their usage.
- Documented metadata for data elements related to business processes, including their usage and standards.
- Improved data quality and consistency, thereby reducing operational risk, through metadata documentation, process-control analysis, and root-cause analysis.
- Researched and evaluated issues related to data quality across multiple lines of business.
- Collaborated with business partners and technology teams to assess and mitigate data-related operational risks in the areas of use, security, privacy, and quality.
- Initiated and participated in determining functional requirements based on user requirements.
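A legacy-to-Guidewire mapping document of the kind described above boils down to a source-field-to-target-field table applied to each record. The sketch below is purely illustrative: the `FIELD_MAP` entries and field names are hypothetical, not the real mapping document.

```python
# Hypothetical source-to-target mapping: legacy field -> Guidewire field.
FIELD_MAP = {
    "POL_NO": "PolicyNumber",
    "EFF_DT": "EffectiveDate",
    "INS_NAME": "InsuredName",
}

def convert(legacy_record):
    """Rename mapped fields; fields absent from the map are dropped for review."""
    return {target: legacy_record[source]
            for source, target in FIELD_MAP.items()
            if source in legacy_record}

legacy = {"POL_NO": "HO-1001", "EFF_DT": "2016-01-01", "LEGACY_ONLY": "x"}
print(convert(legacy))  # {'PolicyNumber': 'HO-1001', 'EffectiveDate': '2016-01-01'}
```

In practice each line of the mapping document also carries type conversions and default values; the dictionary above captures only the rename step.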
Development Tools: GuideWire, PowerDesigner
Confidential, Columbus, OH
BI/Data Analyst SME
Responsibilities:
- Worked in partnership with the Security Assurance Operations (SAO), Symantec Data Loss Prevention (DLP), ArcSight, Attack Analysis, Intel, and Fraud teams to monitor, identify, analyze, and report events.
- Managed and interacted with cross-functional teams for requirements analysis and timelines.
- Initiated meetings with other teams to identify opportunities for better process management and excellence.
- Focused on data management and analysis with an emphasis on problem solving.
- Interviewed candidates for team expansion based on project requirements; worked with team members toward common goals.
- Provided best practices, solutions, and recommendations for issues and project-related matters.
- Participated in meetings to verify requirements and design for data migration from different applications to the data lake and data warehouse.
- Worked closely with end users to learn their specific business needs.
- Used JIRA for issue tracking and ticket creation; worked in an Agile framework.
- Designed and developed extraction of data from DLP and ArcSight to provide better visualization of incidents to analysts and senior management via QlikView.
- Massaged data for visualization for Digital Forensics & Analytic Services, covering data collected from hosts, including systems, mobile devices, and other endpoints.
- Built, managed, and provided security solutions in QlikView to lock down visibility and access to applications on both the shared drive and the access portal; used LDAP as needed.
- Participated in the DLP upgrade process to analyze the impact on existing QlikView applications and their processes.
- Incorporated encryption of backend dashboard data using CryptoJS with the AES algorithm.
- Designed a Threat Intel dashboard to show trends by event type, such as misuse, malware, unauthorized access, denial of service, and social engineering.
- Wrote SQL against an Oracle database for ad-hoc reporting needs.
- Wrote Splunk queries for data extraction and report generation as a power user.
- Connected QlikView to Splunk using the REST API connector.
- Used Python to massage and extract data from data files.
- Worked with data streaming out of the data lake.
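The Python data massaging mentioned above might look like the following sketch, which parses incident records and aggregates them by event type for a trend chart. The pipe-delimited record format is a hypothetical stand-in for the real DLP export.

```python
# Sketch: parse a pipe-delimited incident export and count events by type.
# The record format (date|event_type|host) is hypothetical.
from collections import Counter
import io

raw = io.StringIO(
    "2017-03-01|Malware|host-17\n"
    "2017-03-01|Misuse|host-04\n"
    "2017-03-02|Malware|host-22\n"
)

counts = Counter()
for line in raw:
    date, event_type, host = line.strip().split("|")
    counts[event_type] += 1

print(counts.most_common())  # [('Malware', 2), ('Misuse', 1)]
```

The aggregated counts would then feed a dashboard tool in place of the raw event stream.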
Development Tools: QlikView 12.0, QlikView Server/Publisher, TOAD, Splunk, PEGA, DLP, ArcSight, Hadoop, Machine Learning, RSAM, Python, Erwin
Confidential
BI/Data Analyst
Responsibilities:
- As BI lead, core responsibilities included creating data models to facilitate interactive dashboards, data analysis, data extraction, and data validation.
- Interacted with senior managers and users to understand their information needs and provide the right solutions.
- Articulated business users' needs in a way that was meaningful to IT in order to design and build ETL and custom SQL for reporting and dashboarding solutions.
- Translated technical specifications into business terms.
- Participated in the data life cycle: planning, collection, processing, management, and distribution.
- Heavily involved in the data quality cycle: discover, profile, measure, establish rules, monitor, report, and remediate.
- Incorporated measures such as uniqueness, timeliness, consistency, accuracy, and validity to determine data quality; performed profiling manually via SQL, Alteryx (ETL), visual inspection, etc.
- Used an ETL mapping document (Excel) to track the flow of data originating from the MPX (Mortgage Express) and MAX systems into Oracle and Teradata marts. Oracle stored current data and Teradata stored historical records; both used a star schema.
- Provided input on enhancements, updates, and design of the star model for better data management, usability, access, and performance.
- Wrote complex SQL for data exploration, ad-hoc queries, and analysis; used Oracle analytical functions such as LEAD, LAG, FIRST_VALUE, LAST_VALUE, and partitioning.
- Cross-examined data against both target data sources (Oracle and Teradata).
- Identified and fine-tuned complex SQL queries, fixed SQL issues, and identified SQL logic to get the desired output.
- Converted custom SQL into Alteryx workflows for testing needs.
- Used Tableau to design dashboards for different LOBs: the sales team, underwriters, and the OnBase team. Dashboards included Suspense Report, Loan Status, Loans with Expired Lock, Aging Loans, Applications in Process, Loan Activity, Loans at Risk, etc.
- Dashboards included bar charts, line charts, Gantt charts, heat maps, etc.
- Used Tableau functionality such as calculated columns, parameters, sets, expressions, data blending, hierarchies, actions, groups, bins, quick filters, and self-joins.
- Published workbooks on Tableau Server.
- Reverse-engineered dashboards to resolve data issues, missing data, etc.
- Generated and used data quality reports to capture exceptions based on data quality rules; used exception reports to perform data remediation.
- Participated in the metadata life cycle: creating, maintaining, updating, storing, publishing, and handling deletion.
- Built mock-ups and presentations of the model, the extract-and-load system, and reports for users and upper management.
- Involved in the process of building the star schema model for Oracle and Teradata.
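The LAG/LEAD analytical-function style mentioned above is portable beyond Oracle: SQLite supports the same window functions (version 3.25+), so it can be sketched with nothing but the standard library. The `loan_status` table and its data are hypothetical examples, not the actual mart schema.

```python
# Sketch of the LAG analytical-function style using SQLite window
# functions (requires SQLite >= 3.25). Table and data are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE loan_status (loan_id INT, day INT, balance INT)")
con.executemany("INSERT INTO loan_status VALUES (?, ?, ?)",
                [(1, 1, 100), (1, 2, 90), (1, 3, 75)])

# Day-over-day balance change per loan, LAG looking at the previous row.
rows = con.execute("""
    SELECT day, balance,
           balance - LAG(balance) OVER (PARTITION BY loan_id ORDER BY day)
               AS change
    FROM loan_status
    ORDER BY day
""").fetchall()
print(rows)  # [(1, 100, None), (2, 90, -10), (3, 75, -15)]
```

In Oracle the query body is identical; only the connection setup differs.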
Development Tools: Tableau 8.2/9.1, Oracle 11g, Teradata 15.10, Alteryx Designer 10.0, TOAD for SQL
Confidential, Atlanta, GA
Senior BI /IT Lead/ Analyst
Responsibilities:
- As a QlikView solution provider, supported various LOBs and departments with dashboarding requirements.
- Managed a team of IT developers for dashboarding and reporting needs.
- Gave presentations and demos to users on QlikView functionality; got users on board with the QlikView dashboards.
- Designed dashboards for consumer complaints, Lean Six Sigma, incident management, and chat sessions.
- Analyzed databases to identify all required data attributes. Worked with various data sources such as Salesforce.com, SQL Server, and DB2. Performed data extraction using SQL from different sources to build dashboards via the Salesforce connector, database connections, Excel, XML files, and SharePoint lists.
- Enforced data modeling best practices. Built a QVD architecture for data loads, creating data, transformation, and presentation layers.
- Resolved synthetic keys in the model. Used various load statements, viz. Inline, Resident, Preceding, CrossTable, and so on.
- Used functionality such as ApplyMap, Match, Peek, AGGR, bookmarks, set analysis, and alternate states.
- Created multi-tab dashboards with pivot tables, trend graphs, bar graphs, stacked graphs, cumulative line graphs, containers, and so on.
- Created links to access external reports from dashboards.
- Implemented data-level security by creating a Section Access QVW and using binary loads.
- Performed incremental loads.
- Built Timeliness, Top Trends in Volume, and Top 10 Chat Session Reasons charts, sourcing data from an XML file and using loop functionality to extract the data.
- Responsible for deploying dashboards on QlikView Server.
- Used the Salesforce.com Apex Data Loader to extract, insert, and upload data into the cloud database for business needs.
- Built database tables and wrote ETL processes using SSIS to transform data from sources such as data marts, CRM, flat files, and SharePoint lists. Set up linked databases.
- Wrote VBScript to automate data extraction using the Apex Data Loader.
- Reverse-engineered the existing model.
- Migrated and built reports in Tableau after it was adopted as the new dashboarding tool.
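Incremental loading, as listed above, follows one simple pattern regardless of tool: remember the high-water mark from the previous load and pull only rows beyond it. A minimal sketch, with hypothetical data and field names:

```python
# Incremental-load sketch: only rows newer than the stored high-water
# mark ("max_id") are pulled on each run. Data and field names are
# hypothetical examples.
def incremental_load(source_rows, target, state):
    last = state.get("max_id", 0)
    new_rows = [r for r in source_rows if r["id"] > last]
    target.extend(new_rows)
    if new_rows:
        state["max_id"] = max(r["id"] for r in new_rows)
    return len(new_rows)

source = [{"id": 1}, {"id": 2}, {"id": 3}]
target, state = [], {}
print(incremental_load(source, target, state))  # 3 rows on the first run
source.append({"id": 4})
print(incremental_load(source, target, state))  # 1 new row on the second run
```

In QlikView the same idea is usually expressed as loading new rows from the database, concatenating the existing QVD, and rewriting it.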
Development Tools: QlikView 11.0, QlikView Server/Publisher, Salesforce.com, Apex Data Loader, SAS EG 5.1, SAS Data Integration Studio, MS SQL Server 2005/2008, SQL Server Management Studio, SSIS, Oracle 11g, TOAD, Tableau