Data Analyst Resume
Chicago, IL
PROFESSIONAL SUMMARY:
- Around 6 years of experience as a Data Analyst, with expertise across phases including data analysis, data validation, data extraction, data transformation, data migration, data mapping, data governance, data loading, and data reporting
- Worked across domains including Banking, Finance, Insurance, and Healthcare
- Experienced in processing and interpreting large volumes of data using proprietary software, creating images, and archiving all associated data
- Strong Teradata skills, including building and maintaining Teradata tables, views, constraints, indexes, SQL and PL/SQL scripts, functions, triggers, and stored procedures
- Extensive experience creating ad hoc reports using Teradata, SQL Server, BTEQ scripts, and UNIX
- Experienced in using Teradata utilities such as TPump, MultiLoad, and FastLoad to load data
- Proficient in gathering data from various sources, profiling and defining data, and loading it into the business warehouse
- Experience conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, and design, and Rapid Application Development (RAD) sessions to converge early on a design acceptable to the customer and feasible for the developers, limiting the project's exposure to change
- Experienced in loading data files in AWS environments and performing SQL testing on AWS Redshift databases
- Experienced in security assessment and data classification for GDPR compliance
- Worked on loading data from flat files to Teradata tables using SAS PROC IMPORT and FastLoad techniques
- Experience in Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) and Bug Life Cycle (BLC)
- Coordinated with the business analyst team on requirements gathering and the allocation process methodology, and designed the filters for processing the data
- Experience with Business Objects functionality such as Slice and Dice, Drill Up and Drill Down, Cross Tab, Master/Detail, and Formulas and Variables
- Experience in writing and testing SQL and PL/SQL statements for stored procedures, functions, triggers, and packages
- Experience in developing data applications with Python in Linux, Windows and Teradata environments
- Experienced in retrieving and reporting statistical data from MS SQL Server and presenting it as Excel bar charts and histograms
- Experienced in conducting gap analysis to identify the delta between the current and potential performance of existing software applications
- Expertise in data mining, querying and mining large datasets to discover transaction patterns and examine financial data
- Experience in creating various dashboards using Tableau, Excel, and Access with a focus on user interface and simple data consumption
- Involved in creating dashboards and reports as needed using Tableau Desktop and Tableau Server
- Performed complex data profiling, data definition, and data mining; validated and analyzed data; and presented reports
- Able to work with business users to collect and analyze business and functional requirements, with excellent communication, documentation, and presentation skills and a clear understanding of business process flow
TECHNICAL SKILLS:
Operating Systems: Windows, Windows Server, Linux
Methodologies: Agile, Scrum, Rational Unified Process (RUP)
Data Modeling: Star - Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin
Languages: SQL, PL/SQL, UNIX Shell Scripting, VB Script, Python
ETL/Big Data Tools: Informatica 9.1/8.6/7.1.2, DataStage 8.x, Hadoop, Hive
Reporting Tools: Business Objects 6.5, Brio, Hyperion, Tableau
Testing Tools: LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest
Relational Databases: Oracle 11g/10g/9i/8i/7.x, MS SQL Server, UDB DB2 9.x, Teradata V2R6/R12/R13/R14, MS Access 7.0
PROFESSIONAL EXPERIENCE:
Confidential, Chicago, IL
Data Analyst
Responsibilities:
- Created a star schema dimensional model for the data mart using Visio and created dimension and fact tables based on the business requirements
- Analyzed and gathered user requirements and created the necessary documentation for their data migration
- Developed project management plans, identified key milestones, tracked metrics, and managed stakeholders throughout the lifecycle of multiple projects to ensure adherence to project schedules and budgets
- Used analytic skills to detect debit and credit card fraud in real time
- Moved data from AWS S3 buckets to an AWS Redshift cluster using CLI commands (see the load sketch after this section)
- Responsible for implementing the Informatica/Teradata CDC logic to process the delta data
- Created a Data Governance leadership council and stewardship framework for ongoing support of business rules and requirements
- Contributed to framing the enterprise data handling policy: access control, authentication, data protection at rest and in transit, data destruction, and data retention
- Created reports and charts by querying data using Hive Query Language and reported the gaps in the data loaded to the lake
- Designed reference data and data quality rules and was involved in cleansing the data in the Informatica Data Quality (IDQ) 9.1 environment
- Verified information with customers and directed them to their next steps
- Developed BTEQ scripts in UNIX using PuTTY and used crontab to automate the batch scripts and execute scheduled jobs
- Worked extensively with flat files and mainframe files; created UNIX shell scripts for FTP, for generating list files to load multiple files, and for archiving the files after loads completed
- Worked alongside clients to develop strategies for migrating their business data across platforms using Microsoft SQL Server
- Developed Python programs to read data from various Teradata tables, consolidate them into a single CSV file, and update content in the database tables (see the consolidation sketch after this section)
- Assisted with documentation creation for a new fraud system to be implemented
- Developed SQL Server views for generating metadata reports
- Loaded data into Teradata tables from mainframes by developing SAS scripts
- Built and published customized interactive reports and dashboards, and handled report scheduling, using Tableau Server
- Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau
- Created FastExport, MultiLoad, and FastLoad UNIX script files for batch processing
- Involved in error handling, performance tuning of mappings, and testing of stored procedures and functions, Informatica sessions, and the target data
- Optimized SQL and PL/SQL queries using tuning techniques such as hints, parallel processing, and optimizer rules
- Involved in migrating mappings from IDQ to PowerCenter
- Performed tuning and optimization of database configuration and application SQL
- Responsible for migrating code from the development environment to QA and from QA to production
- Documented the existing mappings as per the design standards followed in the project
- Prepared validation scripts to compare the new data with legacy systems
- Carried out defect analysis and fixed bugs raised by users
Environment: Informatica PowerCenter 9.x, Oracle 11g, SAS, SQL Server 2008, Python, SQL, PL/SQL, Teradata, UNIX Shell Scripting, UNIX, AWS Redshift, AWS S3, Tableau, MS Excel, MS Access
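Illustrative load sketch for the S3-to-Redshift bullet above, shown as a minimal Python/psycopg2 version rather than the exact CLI commands used on the project; the cluster endpoint, bucket, table, and IAM role ARN are hypothetical placeholders.

import psycopg2

# Connect to the Redshift cluster (endpoint and credentials are hypothetical)
conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="...",  # elided; source from a secrets store in practice
)

# Redshift's COPY pulls the staged files directly from S3 in parallel
copy_sql = """
    COPY staging.transactions
    FROM 's3://example-bucket/landing/transactions/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS CSV
    IGNOREHEADER 1;
"""

with conn, conn.cursor() as cur:
    cur.execute(copy_sql)
conn.close()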
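A minimal consolidation sketch for the Teradata-to-CSV bullet above, assuming the teradatasql driver and pandas; the host, credentials, and table names are hypothetical.

import pandas as pd
import teradatasql

TABLES = ["edw.sales_2016", "edw.sales_2017", "edw.sales_2018"]  # hypothetical

with teradatasql.connect(host="tdhost", user="analyst", password="...") as con:
    # Pull each source table into a DataFrame, then stack them into one file
    frames = [pd.read_sql(f"SELECT * FROM {t}", con) for t in TABLES]

combined = pd.concat(frames, ignore_index=True)
combined.to_csv("combined_sales.csv", index=False)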
Confidential, Arlington, TX
Data Analyst
Responsibilities:
- Developed, organized, managed, and maintained graphs, tables, and document templates for the efficient creation of reports
- Prepared dashboards using calculated fields, parameters, calculations, groups, sets, and hierarchies in Tableau
- Wrote Python programs that automated combining large SAS datasets and data files and converting them into Teradata tables for data analysis (see the combining sketch after this section)
- Aided in the stand-up of the initial Data Governance platform
- Created automated Python programs to archive large, unused database tables into mainframe folders (see the archiving sketch after this section)
- Extensively used data blending and embedding functionality in Tableau
- Deployed Tableau Server in a clustered environment by mapping server nodes to the primary machine
- Supported the development of SAS programs for converting large volumes of text files into Teradata tables by importing the text files from mainframes to the desktop
- Extracted data from mainframe flat files using SAS and created SAS data sets
- Generated server-side PL/SQL scripts for data manipulation and validation and created various materialized views and global temporary tables
- Worked on scheduling jobs in UNIX using crontab and sleep functions
- Worked with the Conversion Project team on UAT testing by performing data-level validations
- Administered users, user groups, and scheduled instances for reports in Tableau
- Hands-on development, assisting users in creating and modifying worksheets and data visualization dashboards
Environment: Python, Teradata, Tableau Server/Administrator, Agile, UNIX, Tableau Desktop v7/v8, SAS, PL/SQL, MS Excel
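A combining sketch for the dataset-automation bullet above, assuming the teradatasql driver; the file layout, column names, and work-table DDL are hypothetical.

import glob

import pandas as pd
import teradatasql

# Combine all extract files into a single DataFrame
frames = [pd.read_csv(path) for path in glob.glob("extracts/*.csv")]
combined = pd.concat(frames, ignore_index=True)

with teradatasql.connect(host="tdhost", user="analyst", password="...") as con:
    with con.cursor() as cur:
        # Hypothetical work-table DDL matching the combined file layout
        cur.execute(
            "CREATE TABLE work.combined_extract (acct_id INTEGER, amount DECIMAL(12,2))"
        )
        # teradatasql uses question-mark parameter markers; executemany batches rows
        cur.executemany(
            "INSERT INTO work.combined_extract (acct_id, amount) VALUES (?, ?)",
            combined[["acct_id", "amount"]].values.tolist(),
        )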
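An archiving sketch for the table-archiving bullet above: dump a large, unused table to a compressed CSV before dropping it. The host, credentials, and table list are hypothetical, and the production job moved the archives on to mainframe folders.

import csv
import gzip

import teradatasql

UNUSED_TABLES = ["edw.legacy_orders_2012"]  # hypothetical

with teradatasql.connect(host="tdhost", user="archiver", password="...") as con:
    with con.cursor() as cur:
        for table in UNUSED_TABLES:
            cur.execute(f"SELECT * FROM {table}")
            columns = [d[0] for d in cur.description]
            out_path = table.replace(".", "_") + ".csv.gz"
            with gzip.open(out_path, "wt", newline="") as fh:
                writer = csv.writer(fh)
                writer.writerow(columns)
                # Stream in batches so very large tables do not exhaust memory
                while True:
                    rows = cur.fetchmany(10000)
                    if not rows:
                        break
                    writer.writerows(rows)
            # Drop only after the archive file has been written
            cur.execute(f"DROP TABLE {table}")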
Confidential
Data Analyst
Responsibilities:
- Built and published customized interactive reports and dashboards, with report scheduling on Tableau Server, and created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau
- Effectively used the data blending feature in Tableau and defined best practices for Tableau report development
- Administered users, user groups, and scheduled instances for reports
- Designed and deployed rich graphic visualizations with drill-down and drop-down menu options and parameterization using Tableau
- Created side-by-side bars, scatter plots, stacked bars, heat maps, filled maps, and symbol maps according to deliverable specifications
- Worked in the Tableau environment to create daily, weekly, and monthly dashboards using Tableau Desktop and published them to Tableau Server
- Created Custom Hierarchies to meet the Business requirements
- Managed production, test, and development environments, including installation of patches, security administration, and access controls
- Created Workbooks and dashboards for analyzing statistical billing data using Tableau 8.0
- Converted WEBI Reports to Tableau Dashboards for Advanced Visualizations
- Developed formatted, complex, reusable formula reports and reports with advanced features such as conditional formatting, built-in and custom function usage, and multiple grouping levels
Environment: Tableau Server/Tableau Desktop 8.x/9.x, Dashboard Designer, Windows Server Standard R2, SQL Server Management Studio, MS SQL Server 2008 R2, Oracle 10g, UNIX
Confidential
Junior Data Analyst
Responsibilities:
- This project provided a data analysis solution to the client
- Data was received in raw format from the business; a sprint team was formed to handle the data in SQL and process it in Python for use by the development team
- Communicated client's business requirements by constructing easy-to-understand data and process models
- Analyzed the requirements and segregated them into high-level and low-level use cases and activity diagrams using Rational Rose according to UML methodology, thereby defining the data process models
- Collaborated and built relationships with colleagues and clients in an environment of high-performance customer service, while building and improving operations and processes
- Normalized the data in SQL to remove redundancy and ensure consistent data dependencies
- Improved data collection and distribution processes using the pandas and NumPy packages in Python, enhancing reporting capabilities to provide a clear line of sight into key performance trends and metrics (see the reporting sketch after this section)
- Developed dashboard visualizations using Excel, Tableau, and Power BI to visualize complex, high-volume, high-dimensional data sets and assist in their analysis
- Used Tableau to build visual reports that were easier for clients to understand
Environment: Tableau, Power BI, UML, Python, SQL, MS Excel
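An illustrative reporting sketch for the pandas/NumPy bullet above; the input file and column names are hypothetical.

import numpy as np
import pandas as pd

# Load the raw collections extract (file and columns are hypothetical)
raw = pd.read_csv("collections_raw.csv", parse_dates=["reported_on"])

# Basic cleaning: drop exact duplicates, coerce bad values, and null out
# negative amounts as data-entry errors (vectorized with NumPy)
raw = raw.drop_duplicates()
raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce")
raw["amount"] = np.where(raw["amount"] < 0, np.nan, raw["amount"])

# Roll key performance metrics up by month and region
kpis = (
    raw.groupby([raw["reported_on"].dt.to_period("M"), "region"])["amount"]
       .agg(total="sum", average="mean", volatility="std")
       .reset_index()
)
kpis.to_csv("kpi_summary.csv", index=False)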