
Jr Data Analyst Resume

McLean, VA

SUMMARY:

  • 3 years of experience as an IT professional with functional and industry expertise and accomplished process and project responsibility. Skilled in data analysis, design, development, quality assurance, user acceptance, and performance management disciplines using Lean and Agile techniques, with teams of varying sizes that include permanent and contractor resources.
  • Experience in Data Modeling and Architecture, Database Administration, Data Conversion Validation, Data Warehouse Development, Report Creation, Data Conversion, Applications Testing, Software Quality Assurance, User Acceptance Testing, Training and Support.
  • Experienced and knowledgeable in the Systems Development Life Cycle (SDLC), including requirements gathering, analysis, design, and implementation, and in the Agile (Scrum) software development cycle, including stand-ups for story updates, grooming, PI planning, story creation, raising blockers and impediments, burning hours, and story movement.
  • Knowledge of and experience with banking deposit, credit card, home loan, and auto loan applications
  • Strong Teradata skills, including building and maintaining Teradata tables, views, constraints, indexes, SQL and PL/SQL scripts, functions, triggers, and stored procedures.
  • Experience in Creating Teradata objects including Volatile Table, Derived Table, Global Temporary Table and Multiset Table as needed for retrieving data
  • Extensive experience in creating ad-hoc reports using Teradata, SQL Server, BTEQ scripts, and Unix
  • Experienced in using Teradata utilities such as TPump, MultiLoad, and FastLoad to load data
  • Worked on performance tuning and optimization to improve the efficiency in script executions.
  • Created and utilized sub queries, views and macros needed as part of the job.
  • Used Aggregations, Set Operators like Union, Minus, and Intersect, CASE Expressions and String expressions to retrieve data in required form from multiple tables.
  • Good knowledge in Tableau Reporting Tool and hands on experience on creating heat maps.
  • Worked on loading data from flat files into Teradata tables using SAS PROC IMPORT and FastLoad techniques
  • Expertise in Data mining with querying and mining large datasets to discover transition patterns and examine financial data.
  • Experience in pulling the adhoc data and delivering data to the customers in different formats like excel, pivot tables and flat files.
  • Experience in Testing the database to check field size validation, check constraints, stored procedures and cross verifying the field size defined within the application with metadata.
  • Experience in developing scripts using Teradata advanced techniques such as ROW_NUMBER and RANK functions
  • Experience in creating dashboards, scorecards, and building reports using PerformancePoint Server 2007
  • Experience in Business Objects functionalities like Slice and Dice, Drill Up and Drill Down, Cross Tab, Master/Detail, Formulas and Variables etc.
  • Experience in creating and maintaining security mechanisms for reports and creating repository domains using Business Objects Supervisor
  • Extensive experience in Strategic development of a Data Warehouse and in Performing Data Analysis and Data Mapping from an Operational Data Store to an Enterprise Data Warehouse
  • Hands on Experience in Troubleshooting test scripts, SQL queries, ETL jobs, data warehouse/data mart/data store models
  • Experience in developing data applications with Python in Linux/Windows and Teradata environments.
  • Excellent understanding / knowledge of Hadoop architecture and various components such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node and Map Reduce programming paradigm.
  • Hands on Experience in Developing and maintaining dashboards/reports using Tableau
  • Performed query tuning and troubleshooting of errors in campaign flowcharts
  • Experienced in conducting gap analysis to identify the delta between the current and potential performance of existing software applications
  • Experience in Commercial Off-the-Shelf (COTS) software system implementation, including evaluating and selecting COTS solutions
  • Exceptional ability to research, analyze and convey complex technical information to diverse end-users at all levels. Solutions-driven strategist who consistently improves efficiency, productivity and the bottom line.
  • Recognized for partnering with business leaders and technical teams to plan, integrate, document and execute complex project plans on time and on budget.
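The windowed-rank technique called out above (ROW_NUMBER/RANK for trimming large pulls) can be sketched as follows. This is a hypothetical illustration, not code from the actual projects: it runs the same SQL pattern against SQLite as a stand-in for Teradata, and the table and account data are invented.

```python
import sqlite3

# In-memory SQLite stand-in for a Teradata deposits table (sample data invented).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE deposits (account_id TEXT, snapshot_date TEXT, balance REAL);
INSERT INTO deposits VALUES
  ('A1', '2020-01-31', 1000.0),
  ('A1', '2020-02-29', 1200.0),
  ('A2', '2020-01-31',  500.0);
""")

# Keep only the latest snapshot per account -- the windowed ROW_NUMBER
# pattern used to reduce data pulled from large tables.
latest = conn.execute("""
SELECT account_id, snapshot_date, balance
FROM (
  SELECT d.*,
         ROW_NUMBER() OVER (PARTITION BY account_id
                            ORDER BY snapshot_date DESC) AS rn
  FROM deposits d
) t
WHERE rn = 1
ORDER BY account_id
""").fetchall()

print(latest)  # [('A1', '2020-02-29', 1200.0), ('A2', '2020-01-31', 500.0)]
```

On Teradata, `QUALIFY rn = 1` would typically replace the outer `WHERE` filter; SQLite lacks `QUALIFY`, so the subquery form is shown.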

TECHNICAL SKILLS:

ETL & Big Data: Informatica 9.1/8.6/7.1.2, SSIS, DataStage 8.x, Hadoop, Hive

GUI & Reporting Tools: Business Objects 6.5, Brio, Hyperion, Tableau, Unica Affinium Campaign

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Testing Tools: Win Runner, Load Runner, Test Director, Mercury Quality Center, Rational Clear Quest

RDBMS: Oracle 11g/10g/9i/8i/7.x, MS SQL Server, UDB DB2 9.x, Teradata V2R6/R12/R13/R14, MS Access 7.0

Programming: SQL, PL/SQL, UNIX Shell Scripting, VB Script, Python

Environment: Windows (95, 98, 2000, NT, XP), UNIX

Other Tools: TOAD, AWS, MS-Office suite (Word, Excel, Project and Outlook), BTEQ, Teradata V2R6/R12/R13 SQL Assistant

PROFESSIONAL EXPERIENCE:

Confidential, McLean, VA

Jr Data Analyst

Responsibilities:

  • Involved in analysis, design and documenting business requirements and data specifications. Supported data warehousing extraction programs, end-user reports and queries
  • Interacted with Business analysts to understand data requirements to ensure high quality data is provided to the customers
  • Worked on numerous ad-hoc data pulls for business analysis and monitoring by writing SQL scripts.
  • Created monthly and quarterly business monitoring reports by writing Teradata SQL queries that use system calendars, inner joins, and outer joins to retrieve data from multiple tables.
  • Developed BTEQ scripts on Unix using PuTTY and used crontab to automate batch scripts and execute scheduled jobs
  • Performed verification and validation for accuracy of data in the monthly/quarterly reports.
  • Analyzed and validated data in Hadoop lake by querying through hive tables.
  • Created reports, charts by querying data using Hive Query Language and reported the gaps in lake data loaded.
  • Good knowledge of JSON-format data; performed source and target validations using aggregations and null-validity functions.
  • Created multi-set tables and volatile tables using existing tables and collected statistics on table to improve the performance.
  • Developed Teradata SQL scripts using RANK functions to improve the query performance while pulling the data from large tables.
  • Performed dual data validation on various business-critical reports, working with another analyst.
  • Designed Marketing Campaigns using IBM Unica Affinium Campaign Management Tool
  • Extracted data for the segmentation process for different channels like direct mail and email using the IBM Affinium Campaign Management Tool.
  • Designed visualizations using Tableau and published and presented dashboards on web and desktop platforms.
  • Designed and deployed reports with drill-down, drill-through, and drop-down menu options, and parameterized and linked reports using Tableau.
  • Implemented point-of-view security in Tableau dashboards to facilitate visibility across various levels of the organization
  • Developed Python programs to read data from various Teradata tables, consolidate it into a single CSV file, and update content in the database tables.
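The Teradata-to-CSV consolidation described in the last bullet can be sketched roughly as below. This is a minimal, hypothetical version: the per-table extracts are held as in-memory strings rather than pulled over a live Teradata connection, and all table and column names are invented.

```python
import csv
import io

# Stand-ins for per-table extracts exported from Teradata (data invented).
extracts = {
    "deposits": "account_id,balance\nA1,1000\nA2,500\n",
    "cards":    "account_id,balance\nC9,250\n",
}

# Merge all extracts into one CSV, tagging each row with its source table.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["source_table", "account_id", "balance"])
for table, text in extracts.items():
    for row in csv.DictReader(io.StringIO(text)):
        writer.writerow([table, row["account_id"], row["balance"]])

combined = out.getvalue()
```

In a production version, each extract would come from a database cursor (e.g. via an ODBC or Teradata driver) and `out` would be a file on disk; the merge-and-tag logic stays the same.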

Technical Skills: Teradata SQL Assistant, Teradata, Teradata Loading utilities (Bteq, FastLoad, MultiLoad), Python, Unica Affinium Campaign, Hadoop, Hive, UNIX Shell Scripts, Tableau, MS Excel, MS Power Point.

Confidential, Franklin, TN

Technical Data Specialist

Responsibilities:

  • Responsible for gathering requirements from business and operational analysts; identified the data sources required for reports needed by customers.
  • Wrote Python programs that automated combining large datasets and data files and loading them into Teradata tables for data analysis.
  • Created automated Python programs to archive large, unused database tables into mainframe folders.
  • Developed programs that manipulate arrays using libraries such as NumPy.
  • Performed performance tuning and optimization to increase script efficiency by creating indexes, adding constraints, and optimizing queries
  • Wrote SQL scripts for large data pulls and ad-hoc analysis reports, using Teradata advanced techniques such as RANK and ROW_NUMBER.
  • Generated graphs using MS Excel Pivot tables and creating presentations using Power Point.
  • Generated reports using Proc Tab, Proc Report, DATA _NULL_, Proc SQL, and macros. Used OLAP functions such as sum, count, and csum.
  • Communicated with business users and analysts on business requirements. Gathered and documented technical and business Meta data about the data.
  • Created numerous processes and flow charts to meet the business needs and interacted with business users to understand their data needs.
  • Created Set, Multiset, Derived, Volatile, Global Temporary tables to retrieve the data from multiple tables.
  • Experience in writing Korn shell scripts for automating jobs. Automated reports by connecting to Teradata from MS Excel using ODBC.
  • Documented scripts, specifications, other processes and preparation of Technical Design Documents.
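The table-archiving rule described above (move tables that are large and no longer in use) can be sketched as a simple selection function. Everything here is hypothetical for illustration: the table metadata, size threshold, and cutoff date are invented, and a real version would read this metadata from the database catalog before exporting to mainframe storage.

```python
from datetime import date

# Invented metadata standing in for what a catalog query would return.
TABLES = [
    {"name": "stg_deposits_2016", "size_gb": 120, "last_used": date(2017, 3, 1)},
    {"name": "rpt_monthly",       "size_gb": 4,   "last_used": date(2019, 6, 1)},
    {"name": "stg_cards_2015",    "size_gb": 80,  "last_used": date(2016, 11, 5)},
]

def archive_candidates(tables, min_size_gb=50, cutoff=date(2018, 1, 1)):
    """Return names of tables large enough and idle long enough to archive."""
    return sorted(t["name"] for t in tables
                  if t["size_gb"] >= min_size_gb and t["last_used"] < cutoff)

print(archive_candidates(TABLES))  # ['stg_cards_2015', 'stg_deposits_2016']
```

Keeping the selection rule as a pure function makes the thresholds easy to adjust and the candidate list easy to review before any table is actually moved.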

Technical Skills: Teradata, Teradata utilities (SQL Assistant, BTEQ, Fast Load, Fast Export), Hadoop.
