- Comprehensive knowledge of several business domains, with Telecommunications, Banking and Finance, Medical, and Insurance being the strongest.
- Solid experience as a data analyst in Business Intelligence, data warehousing, data visualization, and ETL processes.
- Extensive knowledge of data warehouse concepts, including design and implementation of data marts with Star and Snowflake schemas, and a thorough understanding of newer concepts such as RDM, MDM, ODS, and RDWs.
- Worked with a variety of tools including Power BI, Tableau, Informatica, Teradata, Microsoft SQL Server Management Studio, and DB2/DataStage.
- Expertise in building complex SQL queries to perform data mining, data analytics, and data extraction across databases using WinSQL and PuTTY.
- Experience creating a range of visualizations: bar, line, and pie charts, maps, scatter plots, Gantt charts, bubbles, histograms, bullets, heat maps, and highlight tables.
- Designed and optimized connections, data extracts, schedules for background tasks, and data refreshes for corporate deployments.
- Specialized in Tableau architecture for designing and developing reports and dashboards. Expert in features such as filters, sorts, groups, sets, reference lines, parameters, and calculated fields. In-depth knowledge of mapping techniques, using geocoding to visualize non-geographic data on a geographical map.
- Azure application support experience with Azure PaaS.
- Proficient in writing complex SQL queries, stored procedures, functions, triggers, and sub-queries, as well as normalization, database design, and index creation.
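As a minimal, self-contained sketch of the SQL constructs named above (SQLite syntax via Python for portability; the actual work used SQL Server and Teradata, and every table, column, and trigger name here is hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: orders reference customers by key.
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id),
    amount REAL,
    updated_at TEXT
);
-- Index to speed up lookups by customer.
CREATE INDEX idx_orders_customer ON orders(customer_id);
-- Trigger keeps an audit timestamp current after every update.
CREATE TRIGGER trg_orders_touch AFTER UPDATE OF amount ON orders
BEGIN
    UPDATE orders SET updated_at = datetime('now') WHERE id = NEW.id;
END;
""")

cur.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
cur.executemany("INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
                [(1, 100.0), (1, 250.0), (2, 75.0)])

# Sub-query: customers whose total order amount exceeds the average order.
cur.execute("""
SELECT c.name, SUM(o.amount) AS total
FROM customers c JOIN orders o ON o.customer_id = c.id
GROUP BY c.name
HAVING total > (SELECT AVG(amount) FROM orders)
""")
rows = cur.fetchall()
print(rows)  # Acme's 350.0 total exceeds the 141.67 average order
```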
- Proficient in tuning queries used in applications.
- Experience in SAS administration, data warehousing, and business intelligence.
- Professional experience of working on different SDLC methodologies such as Agile (Scrum), Waterfall & Rational Unified Process.
- Highly capable of facilitating Joint Application Development and Modeling (JAD/JAM) sessions, requirements workshops, and user interviews, and of acting as a liaison between clients, managers, consultants, end users, developers, QA, and all other project stakeholders.
- Proficient in preparing Test Scenarios, Test Cases, & Test Artifacts for successful execution of the product.
- Extensive experience with aspect-oriented programming and Test-Driven Development (TDD); developed infrastructure frameworks using Inversion of Control and dependency injection (Unity).
- Involved in User Acceptance Testing (UAT) and various testing strategies; knowledgeable in software testing, process testing, and quality control.
- Excellent written and verbal communication skills; skilled at documenting user requirements and system specifications.
- Preparing data kits using Excel macros and functions. Transitioning documents to template format.
- Gathered requirements, created, and maintained user reports using internal credit system platform.
- Advanced techniques in selling and demonstrating features of Windows 10, Microsoft Office, and other Microsoft products.
Databases: MS SQL Server 2012/2008/2005, Oracle 9i/11g, MS Access 2007, Teradata 12, Azure SQL DB.
Reporting: Crystal Reports, SQL Server 2000/2005 Reporting Services.
ETL Tools: Informatica 9.0/8.1, DataStage, TOAD, Microsoft SQL Server Management Studio, MySQL.
Database Development: Power BI, T-SQL, PL/SQL, DataStage, Teradata, Informatica PowerCenter (ETL).
Office Applications: MS Office (Word, Excel, Visio, PowerPoint, Project)
Operating Systems: Windows 10/2000/XP/2003
Programming Languages: .NET, C++ (reading proficiency), VB 6.0, SQL, SAS.
Confidential, Eagan, MN
IT Business & System Analyst
- Responsible for applying the various delivery methodologies employed by the client and working with IT and business stakeholders on the discovery, identification, and documentation of data requirements for development of enterprise data solutions and maintenance of the data ecosystem to enable strategic planning. Serves as a subject matter expert and mentor for junior members of the Data Analyst IT team.
- Interview stakeholders to elicit and document highly complex data needs, related data model changes and data quality and integration requirements.
- Contribute to the design of the data ecosystem and data management processes to move the organization to a more sophisticated, agile, and robust target-state data architecture.
- Create, design, and format technical specification documentation including architecture diagrams, process flows, and other information or processes needed to describe new requirements or changes for development and QA related to data sources, data objects, data transformation, and data movement.
- Responsible for managing user stories, including eliciting data-solution discovery, definitions, and story details, to manage sprint deliveries for the solution team.
- Conduct in-depth data analysis using knowledge of data structures, enterprise analytic consumption patterns and PBM data.
- Collaborate continuously with programmers, engineers, analytic communities, data stewards, organizational leaders and other key roles to identify opportunities for process improvements to drive data development.
- Developed data management processes and infrastructure to efficiently source and integrate multiple internal and external data sources, increasing the value of data to the organization.
- Investigated issues and questions and recommended resolutions through data-domain analysis and clear presentation of findings.
- Interacted closely with team members on a major enterprise initiative, leading user-story management and eliciting data-solution discovery, definitions, and story details to manage sprint deliveries for the assigned solution team.
- Create, design, and format technical specification documentation such as process flows, STTs and other information or processes needed to deliver data solutions.
- Experience managing lifecycle of epics/user stories in an agile environment.
- Expert in PBM data analysis and documentation; experienced working with medical claims, member, provider, and eligibility data.
- Excellent experience interacting with external clients, business stakeholders, and vendors.
- Deep domain expertise of cloud application development solutions.
- Worked on a project using Microsoft Access to create a Customer Tracking System to track customers, their status, and their information; performed requirements gathering and analysis for the system.
- Gathered requirements from users and created requirement documents.
- Involved in Tableau Server administration tasks such as installations, upgrades, user and user-group creation, and setting up security features.
- Gathered information extensively by conducting walkthroughs, workshops, brainstorming sessions, and surveys; identified, analyzed, and prioritized the business requirements to be included in Advantage.
- Worked with database connections, joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
- Extensively used techniques such as multiple breaks, sorting, ranking, report filtering, query filtering, alerters, sections, and if-conditions while developing and creating reports.
- Prepared technical documentation for the reports as part of migrating them from the development to the production environment.
- Also worked on querying /pulling data extracts (using SQL Server / MS Access) during the Data Cleansing / Migration phase of the project.
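The cleansing-phase pulls described above typically trim stray whitespace, drop records with missing values, and keep only the latest record per business key. A hedged sketch, using SQLite via Python in place of SQL Server/MS Access, with an entirely hypothetical staging table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging_members (member_id TEXT, name TEXT, loaded_at TEXT);
INSERT INTO staging_members VALUES
    ('M1', 'J. Smith ', '2020-01-01'),
    ('M1', 'J. Smith',  '2020-02-01'),  -- later duplicate supersedes
    ('M2', NULL,        '2020-01-15');  -- missing name, excluded
""")

# Trim whitespace, drop NULL names, and keep only the most recent
# record per business key (a common dedup pattern during migration).
rows = conn.execute("""
SELECT member_id, TRIM(name) AS name
FROM staging_members s
WHERE name IS NOT NULL
  AND loaded_at = (SELECT MAX(loaded_at)
                   FROM staging_members
                   WHERE member_id = s.member_id)
ORDER BY member_id
""").fetchall()
print(rows)
```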
Technologies: Teradata, Toad, SQL Server 2012, Flat Files, Excel source files, XML files, T-SQL, SQL Server Management Studio, Azure, Agile methodology.
Confidential, Nashville, TN
Data Management & System Analyst
- Analyze business and implementation requirements to enhance ETL applications and their functionality.
- Design, Develop and test ETL (Extract, Transform and Load) data processes using specialized application software like Informatica PowerCenter for populating data into Teradata Databases as per business requirement.
- Involved in all phases of the SDLC, including system analysis, application design, development, testing, and implementation of data marts and data warehouses.
- Built ETL transformations in Informatica PowerCenter Designer to populate staging objects from external sources (e.g., job runs, load files).
- Responsible for code management; implemented CI/CD functionality to automate migration of Teradata applications to various environments.
- Set up Tableau Server, built user groups, and assigned permissions to users; created update schedules for reports stored on the server.
- Used Tableau Commander for automated reports; upgraded Tableau Server and Desktop to version 10.
- Worked with operational data and records received in various formats, connecting data sources to Tableau for visual analysis.
- Design and Implement strategies to migrate Clinical datasets to Teradata Core System.
- Take end-to-end responsibility for the complete ETL lifecycle in the organization.
- Supervise and improve the performance and efficiency of ETL jobs in Informatica based on the specifications.
- Design and customize shell scripts for business use, enhancing the performance and throughput of data acquisition into core layers.
- Used SQL Server to retrieve various data, including customer information, equipment information and rental information.
- Communicate with all levels of staff, including team members, administrators, business analysts, project managers, leads, and the business team; report progress to the manager on a regular basis.
- Worked extensively with biostatisticians and clinical data managers to provide SAS programs that analyze the data and generate reports, tables, listings, and graphs.
- Used Spring APIs for wiring components via dependency injection.
- Work on analysis of requirements and design solutions to meet the business requirements.
- Produce design proposals following best practices, considering the data, business, and service layers for all system changes, including defects, business impacts, work prioritization, and estimation.
- Work with Business analysts to understand requirements and create technical specifications.
- Used Tableau to create worksheets, dashboards, and stories for analysis, using features such as filters, calculated fields, parameters, and geocoding to manipulate data and create advanced chart types.
- Used Tableau Desktop to create various analysis worksheets, including customer analysis and equipment analysis.
- Assembled worksheets into dashboards and stories and published them to the server, setting permissions and views for different user groups.
- Develop Source-to-Target mappings in PowerCenter Designer.
- Develop and Execute unit test plans.
- Develop and Deploy back End Stored Procedures, Functions & Triggers in PL/SQL.
- Develop Informatica mappings with required transformations such as Aggregator, Filter, Lookup, Sorter, Normalizer, and Update Strategy.
- Involved in troubleshooting Teradata scripts, fixing bugs, addressing production issues, and performance tuning.
- Create, optimize, review, and execute Teradata SQL test queries to validate transformation rules used in source to target mappings/source views and to verify data in target tables.
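The validation queries described above usually pair a row-count reconciliation with a minus-style comparison between source and target. A small sketch of that pattern, using SQLite via Python rather than Teradata SQL, with hypothetical claim tables and deliberately seeded mismatches:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_claims (claim_id TEXT, amount REAL);
CREATE TABLE tgt_claims (claim_id TEXT, amount REAL);
INSERT INTO src_claims VALUES ('C1', 10.0), ('C2', 20.0), ('C3', 30.0);
INSERT INTO tgt_claims VALUES ('C1', 10.0), ('C2', 99.0);  -- C2 mismatched, C3 missing
""")

# Row-count reconciliation: totals should match between source and target.
src_count, tgt_count = conn.execute(
    "SELECT (SELECT COUNT(*) FROM src_claims), (SELECT COUNT(*) FROM tgt_claims)"
).fetchone()

# Minus-style check: source rows with no exact match in the target
# surface both missing and mis-transformed records.
diffs = conn.execute("""
SELECT claim_id, amount FROM src_claims
EXCEPT
SELECT claim_id, amount FROM tgt_claims
ORDER BY claim_id
""").fetchall()
print(src_count, tgt_count, diffs)
```

On real Teradata the same idea is expressed with `MINUS`/`EXCEPT` against the source views and target tables named in the mapping document.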
- Develop various complex mappings using Mapping Designer involving Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Review the BRD, design, and FRD documents provided, and prepare ETL mapping and unit-testing documents.
- Carry out sanity checks and perform environment validations on all required databases.
Technologies: Teradata, Toad, SQL Server 2012, Flat Files, Excel source files, XML files, TSQL, SQL management studio, CMS, Informatica PowerCenter 9.0, Tableau, Agile methodology, Azure.
Confidential, Cincinnati, OH
Teradata Migration Specialist
- Tested data migration projects; executed and verified ETL jobs (DataStage).
- Worked with Oracle, SQL Server, IBM DB2, and Teradata databases; performed extensive backend testing by developing, modifying, and executing SQL queries for data validation and verification.
- Validated data from multiple sources like Files, XMLs, Databases.
- Excellent verbal, written, and interpersonal communication skills.
- Responsible for ETL processes and relational database design methods; provided input based on a good understanding of data warehousing concepts, data analysis, data warehouse architecture, tools, and testing.
- Designed and developed customized interactive dashboards in Tableau using marks, actions, filters, parameters, calculations, and relationships.
- Created and modified dashboards using Tableau visualization features and delivered reports to the business team in a timely manner.
- Worked with Selenium/Java automation framework.
- Used the Autofac dependency injection container to simplify the configuration of object-oriented applications.
- Used ALM/Quality Center for defect tracking and the Requirements Traceability Matrix.
- Created and updated the written Test Plan, Test Cases and Test Scripts.
- Ensure that proper documentation is captured, maintained, understood by and available to business and/or IT partners.
- Customized the Windows 10 image for the appropriate line of business.
- Used FactSet and Bloomberg terminals to gather information for analysis; exported data to formats such as Excel and imported it into Tableau.
- Extensively used Base SAS procedures such as Proc Contents, Proc Print, and Proc Sort.
- Generated clinical data summary tables/Listings/graphs and reports using SAS.
- Used Tableau to conduct analysis with features such as parameters and calculated fields, and built visualization reports based on the analysis.
- Escalate critical technical issues and potential problems as needed to the Project Manager to address any related issues in advance of deadlines.
- Work closely with the various IT departments ensuring that their existing data conversion and integration needs within the system is met and that the integration solution is within the standards and guidelines.
- Proposed any refinements as needed to the existing documentation and standards.
Technologies: Informatica 9.0/8.1, Tableau, MySQL, Eclipse UI, METL, SQL Server 2008/2012, Flat Files, Excel source files, XML files, SQL, Agile methodology.
Confidential, Harrisburg, PA
Data & System Analyst
- Responsible for all stages of the SDLC (software development life cycle) for multiple enterprise ETL projects within the organization, including analysis, requirements gathering, documentation, and testing.
- Analyze existing data models, flow charts, use-case descriptions, sequence diagrams, ERDs, mockup screens, source-to-target ETL mappings, data sources, existing ETL applications, and design documents; perform logical/physical and dimensional data modeling.
- Designed ER diagrams as part of the requirement-analysis phase of the project.
- Document requirements and create high-level (HLD) and low-level (LLD) design documents for ETL projects; create Application Design Documents (ADD) capturing architecture diagrams, data dictionaries, and workflow diagrams.
- Successfully upgraded Tableau platforms in clustered environment and performed content Upgrades.
- Established Best Practices for Enterprise Tableau Environment and application intake and Development processes.
- Worked with creating loosely coupled classes using Dependency Injection.
- Designing, Developing, and supporting interactive Tableau Dashboard reports. Translate complex requirements into technical solutions.
- Reviewed and enhanced SAS code for loading and updating data sources daily and weekly; fixed data-loading errors and created daily error reports for end users.
- Monitored and debugged the entire heat-map scheduling tool as a technical analyst, fixing errors and failed jobs in the heat-map tool, BI tools, and data warehouse applications, and reporting data flow per the scheduled daily/weekly/monthly runs.
- Read SAS and .NET code to document current and future data; reviewed programs and workflows, gathered requirements, and performed gap analysis on business processes.
- Created SAS datasets from Excel, Oracle database using SAS Macros.
- Used SAS/ACCESS to communicate with Excel and Oracle databases, and VBA to concatenate Word files.
- Developed reports using query tools (AQT and IBM Data Studio) by reading existing SAS and .NET programs, delivering business insights for decision-making; converted SAS and .NET programs into reports answering business questions.
- Responsible for understanding large volumes of data received from multiple workflow systems and experience with data trending, quality assurance.
- Research, analyze and communicate business systems capabilities and processes. Work closely with business unit users (stakeholders) to identify process improvements and provide various options for future systems design that increase corporate efficiencies while considering impact and integrity.
- Leads the requirements definition and gathering process for assigned projects; elicits, analyzes, documents, and manages business requirements and rules.
- Develops and demonstrates high-level warehouse subject-matter expertise in functional areas to bridge business requirements and IT solutions; develops and delivers requirement and solution presentations to all levels; develops, coordinates, leads, and teaches a variety of business-process and systems training.
- Define and document process flows for existing and new warehouse system development.
- Create, manage, and execute application testing plan, procedure, and tests. Promote, build, and strengthen vendor/customer relationships. Develop and maintain department documentation, policy, and procedures.
- Lead and manage projects with business-critical deadlines from inception through implementation. Develop and implement project plans. Define and manage system/project expectations and scope. Lead project personnel to identify and communicate project risks and recommend solutions.
- Create user stories and acceptance criteria for identified features.
- Provide primary and secondary applications support for internal system and users.
- Clearly communicates risks, benefits, goals, and gaps, providing an understanding of the impacts so all levels of management can make informed decisions.
- Leads, plans, facilitates, and conducts meetings related to assigned projects.
- Works with the business to develop effective solutions to meet business needs. Ensure business applications/systems are functioning properly and accessible to users.
- Provides technical direction and ensures compliance with best practice solutions and corporate IT road map.
- Effectively manage work tasks and priorities and performs position responsibilities with minimal supervision as part of support team.
- Applies agile mindset in working through process changes and improvements as well as collaborating with internal and external business partners to gain efficiency for the organization.
- Fixed network and connection errors between all business applications and ETL applications.
Technologies: SAS, DataStage, Tableau, SQL Server 2005/2008/2012, Flat Files, Excel source files, XML files, SQL, Agile methodology.
- Responsible for developing and implementing data analyses, data collection systems, and other strategies that optimize statistical efficiency and quality.
- Modify software programs to improve performance.
- Designing and maintaining data systems and databases; this includes fixing coding errors and other data-related problems.
- Analyze data systems and develop scripts to move data from one database to another using Informatica.
- Collaborate with programmers, engineers, and organizational leaders to identify opportunities for process improvements, recommend system modifications, and develop data policies.
- Develop and schedule batch/ Database programs using PL/SQL to validate data and fix data issues and to process ETL mappings without any errors.
- Design solutions and conduct data analysis.
- Identify ways to improve data reliability, efficiency, and quality.
- Complete in-depth data analysis and contribute to strategic efforts.
- Troubleshoot problem areas and identify data gaps and issues.
- Implement data solutions within a variety of systems/applications.
- Analyze existing system-logic difficulties and revise program logic and procedures to provide more efficient operations.
- Identify and recommend new ways to improve quality and productivity by streamlining operational processes.
- Use statistical methods to analyze data and generate insightful project reports.
- Use SQL, T-SQL, or other programming languages to prepare and present data in a way that is visually appealing and simple for clients to navigate and comprehend.
- Identify and develop data quality initiatives and opportunities for automation.
- Define end-to-end architecture for data ingestion, storage, processing, and visualization.
- Carrying out data analysis, validation, cleansing, and collection and reporting to support diagnostic software and hardware development work.
Technologies: SQL Server 2005/2008/2012, Flat Files, Excel source files, XML files, SQL, Power BI.