
Sr Data Analyst Resume

Plano, TX

OBJECTIVE:

Seeking a challenging position that utilizes my knowledge, skill set, and experience, and provides opportunities for continuous learning and improvement.

SUMMARY:

  • 8 years of experience in data analysis, report generation, maintenance of business report processes, and data verification and validation.
  • Experienced with relational databases such as Teradata, Oracle, and SQL Server, and with the cloud-based Snowflake data warehouse (on AWS).
  • Experienced in interacting with users, analyzing client business processes, documenting business requirements, performing design analysis, and developing design specifications.
  • Extensive SQL experience in querying, data extraction, and data transformation.
  • Extensive experience working as a dual controller on business projects requiring dual data validation and data consistency.
  • Experienced working with large volumes of data using Teradata SQL and Base SAS programming.
  • Strong experience with MS Access, MS Excel, and MS PowerPoint.
  • Good experience in production support: identifying root causes, troubleshooting, and submitting change controls.
  • Experienced in handling all the domain and technical interaction with application users, analyzing client business processes, documenting business requirements.
  • Proficiency in prioritizing and multitasking to ensure that tasks are completed on time.
  • Strong consulting skills; ability to communicate, explain and otherwise impart technical information to clients clearly.
  • Demonstrated ability to identify root causes of problems, consider both the long and short-term impact of proposed solutions and develop workable solutions.
  • Ability to manage (Multiple) project tasks with changing priorities and tight deadlines.
  • Demonstrated ability to work well with people at all levels, foster cooperation and collaboration within the work unit, help the team resolve conflicts constructively, and communicate when help is needed.
  • A Self-starter with a positive attitude, willingness to learn new concepts and acceptance of challenges.
  • Good Communication and interpersonal skills.

TECHNICAL SKILLS:

Primary Skills: Data Warehousing, Teradata tools, Teradata SQL, SAS

Languages: ANSI SQL, Teradata SQL, PL/SQL, Python, R

Databases: Teradata, Oracle (via Toad), SQL Server 2005

Database Utilities: BTEQ, FastLoad, MultiLoad, Import/Export, SQL Assistant

Scripting Languages: UNIX Korn shell scripting, PL/SQL

GUI: MS Office Suite, Visual Basic 6.0

Project Management: JIRA

Tools: GitHub, TortoiseSVN, HP ALM, Ab Initio Metadata Hub, Hammer

Report Scheduling: Jenkins, Airflow

Cloud: Snowflake (on AWS)

PROFESSIONAL EXPERIENCE:

Confidential, Plano, TX

Sr Data Analyst

Responsibilities:

  • Responsible for making the necessary changes in all the Teradata scripts for new enhancements to the Brand Profit & Loss (BPL) system for the entire global network.
  • Compared the scripts across the development, production, and QA environments to make sure they were in sync, then made the BPL enhancements in development.
  • Enhanced all 64 Teradata objects and 34 shell scripts in the Dev and QA environments within the given time frame.
  • Responsible for pushing the code to the QA and production environments using StarTeam.
  • Participated in the UAT phase with reporting developers, front-end developers, and the testing team to make sure all necessary changes were pushed in the given time frame.
  • Responsible for addressing all defects raised in HP QC regarding the Teradata enhancements in QA and Prod.
  • Identified all reports that use the enhanced tables and guided the dependent reporting teams through the corresponding report changes.
  • Provided hypercare support after the Teradata changes moved to production to address any issues.

Confidential, Plano, TX

Sr Data Analyst

Responsibilities:

  • Joined the auto loan servicing team to assist with the migration and remodeling of the auto data warehouse from legacy systems to Snowflake.
  • Identified all current reporting columns and tables needed for the migration from Teradata to Snowflake.
  • Worked in a sprint model, coordinating with the data modeler and data engineers to bring new tables into Snowflake per user requests.
  • Developed a crosswalk document mapping the legacy system to the new auto data model in Snowflake.
  • Updated the metadata and data lineage documents to obtain data risk management approvals for the newly modeled tables.
  • Validated the data between legacy tables and newly modeled tables in order to sign off on data loads and release to users in production.
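The legacy-vs-Snowflake validation step above can be sketched in a few lines of Python. This is an illustrative reconciliation pattern only (the `reconcile` helper, key column, and sample rows are hypothetical, not the actual sign-off tooling): compare row counts and row contents keyed on a business key before releasing a migrated table.

```python
# Hypothetical sketch of a legacy-vs-migrated table reconciliation:
# compare row counts and keyed row contents between two result sets.

def reconcile(legacy_rows, migrated_rows, key_index=0):
    """Return a dict of mismatch findings; an empty dict means sign-off."""
    findings = {}
    if len(legacy_rows) != len(migrated_rows):
        findings["row_count"] = (len(legacy_rows), len(migrated_rows))
    # Key rows on a business key so ordering differences don't matter.
    legacy_by_key = {r[key_index]: r for r in legacy_rows}
    migrated_by_key = {r[key_index]: r for r in migrated_rows}
    missing = set(legacy_by_key) - set(migrated_by_key)
    if missing:
        findings["missing_keys"] = sorted(missing)
    changed = [k for k in legacy_by_key
               if k in migrated_by_key and legacy_by_key[k] != migrated_by_key[k]]
    if changed:
        findings["changed_rows"] = sorted(changed)
    return findings

legacy = [(101, "ACTIVE", 15000.00), (102, "CLOSED", 0.00)]
migrated = [(102, "CLOSED", 0.00), (101, "ACTIVE", 15000.00)]
print(reconcile(legacy, migrated))  # {} -> tables reconcile
```

In practice the two result sets would come from queries against the legacy database and Snowflake; the same key-based comparison then tolerates row-order differences between the two systems.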

Confidential, Plano, TX

Sr Data Analyst

Responsibilities:

  • Supported the team in changing the existing Teradata scripts and reports to point to the subservicer (Rushmore) data for ongoing data.
  • Identified all current reporting columns and tables needed for the migration from Teradata to Snowflake.
  • Developed numerous servicing reports per business requirements and deployed the code in Hammer for ongoing business use.
  • Converted the existing Teradata scripts that pull OCC audit data to point to the subservicer data on an ongoing basis, with historical data from the existing database.
  • Migrated the existing scheduled reports from Jenkins to Airflow by writing Python DAGs (Directed Acyclic Graphs) following the existing process flow.
  • Converted the Teradata scripts to Snowflake and validated the data before deploying them into the new production environment.
  • Developed a report identifying customers in 2017 disaster-relief areas who were granted relief on mortgage payments but were still reported as having missed payments on their credit reports.
  • Developed a script to identify loans in a repayment plan where the borrower failed to make three consecutive payments.
  • Worked on loading data from flat files into Teradata tables and converting it into business reports per requirements.
  • Created multiple reports on the daily transactional data which involves millions of records.
  • Used inner and outer joins while creating tables from multiple tables.
  • Created multiset, temporary, derived, and volatile tables in the Teradata database.
  • Implemented indexes, collected statistics, and applied constraints while creating tables.
  • Developed various ad hoc reports based on the requirements
  • Involved in writing complex SQL queries using correlated sub queries, joins, and recursive queries.
  • Delivered the artifacts within the timelines and excelled in the quality of deliverables.
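The repayment-plan check described above (flagging loans where the borrower missed three consecutive payments) reduces to a simple streak scan. The sketch below is illustrative only; the payment-history encoding (1 = payment made, 0 = payment missed) is an assumption, not the actual servicing data layout.

```python
# Illustrative repayment-plan check: flag a loan when the borrower
# misses `streak` consecutive scheduled payments.
# Encoding assumption: 1 = payment made, 0 = payment missed.

def missed_three_consecutive(payment_history, streak=3):
    """Return True if the history contains `streak` consecutive misses."""
    run = 0
    for made in payment_history:
        run = run + 1 if made == 0 else 0
        if run >= streak:
            return True
    return False

print(missed_three_consecutive([1, 0, 0, 0, 1]))     # True
print(missed_three_consecutive([1, 0, 1, 0, 1, 0]))  # False
```

The same logic is easy to express in SQL with a windowed running count, but the streak scan makes the rule itself explicit.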

Confidential, Plano, TX

Sr Data Analyst

Responsibilities:

  • Performed data analysis on the single-family servicer claims data by writing the SQL queries using Oracle Toad.
  • Analyzed the number of claims per servicer, outstanding balances, claims processed, and curtailed amounts at a monthly level by writing aggregate-level scripts.
  • Updated the existing Tableau reports with ongoing data for monthly review by business stakeholders.
  • Developed SQL scripts to build a business data layer with all the historical processed claims.
  • Cross validated the claim level data between existing production tables and reporting data layer with SQL queries.
  • Developed SAS scripts to copy the data from production to a sandbox for claims data analysis.
  • Developed complex SQL queries to bring data together from various systems.
  • Organized and conducted cross-functional meetings to keep the phased approach on track.
  • Created multiple reports on the daily transactional data which involves millions of records.
  • Used inner and outer joins while creating tables from multiple tables.
  • Implemented indexes, collected statistics, and applied constraints while creating tables.
  • Developed various ad hoc reports based on the requirements.
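The monthly claims aggregation described above follows a standard GROUP BY pattern. The sketch below uses Python's built-in SQLite in place of Oracle so it is self-contained; the table and column names (`claims`, `servicer`, `claim_month`, `outstanding_balance`, `curtailed_amount`) are hypothetical stand-ins for the actual claims schema.

```python
# Self-contained sketch of a monthly claims aggregation, using SQLite
# in place of Oracle. Schema and sample data are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE claims (
    servicer TEXT, claim_month TEXT,
    outstanding_balance REAL, curtailed_amount REAL)""")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?, ?)",
    [("SvcA", "2020-01", 1000.0, 50.0),
     ("SvcA", "2020-01", 2000.0, 0.0),
     ("SvcB", "2020-01", 500.0, 25.0)])

# One row per servicer per month: claim count, total outstanding,
# and total curtailed amount.
rows = conn.execute("""
    SELECT servicer, claim_month,
           COUNT(*)                 AS claim_count,
           SUM(outstanding_balance) AS total_outstanding,
           SUM(curtailed_amount)    AS total_curtailed
    FROM claims
    GROUP BY servicer, claim_month
    ORDER BY servicer""").fetchall()
for r in rows:
    print(r)
```

The same query shape carries over to Oracle directly; only the connection layer (Toad / cx_Oracle instead of sqlite3) changes.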

Confidential, Plano, TX

Sr Data Analyst

Responsibilities:

  • Developed several scripts to gather all the required data from different databases to build the monthly LAR (Loan Application Register) file.
  • Implemented data quality checks on the monthly LAR file to ensure the data complies with federal regulations.
  • Developed numerous reports to capture the transactional data for the business analysis.
  • Developed complex SQL queries to bring data together from various systems.
  • Organized and conducted cross-functional meetings to keep the phased approach on track.
  • Collaborated with a team of business analysts to ensure all requirements were captured.
  • Archived historical Teradata datasets to a UNIX server using SAS scripts and restored them to Teradata whenever required.
  • Assisted the team in standardizing reports using SAS macros and SQL.
  • Developed and modified SAS programs in a UNIX environment and wrote UNIX shell scripts for scheduling jobs.
  • Used inner and outer joins while creating tables from multiple tables.
  • Created Multiset, temporary, derived and volatile tables in Teradata database.
  • Implemented indexes, collected statistics, and applied constraints while creating tables.
  • Used ODBC connectivity to Teradata from MS Excel to retrieve data automatically from the Teradata database.
  • Delivered the artifacts within the timelines and excelled in the quality of deliverables.
  • Validated the data during UAT testing.
  • Performed source-to-target mapping.
  • Involved in metadata management: listed all table specifications and implemented them in the Ab Initio Metadata Hub per data governance.
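Record-level quality checks like the ones run on the monthly LAR file typically validate each field against a rule before submission. The sketch below is a generic pattern only; the field names and rules (`loan_amount`, `action_taken`, `state`) are illustrative and are not the actual federal edit checks.

```python
# Hedged sketch of per-record LAR-style data quality checks.
# Field names and rules are illustrative, not the real regulatory edits.

RULES = {
    "loan_amount": lambda v: isinstance(v, (int, float)) and v > 0,
    "action_taken": lambda v: v in {1, 2, 3, 4, 5, 6, 7, 8},
    "state": lambda v: isinstance(v, str) and len(v) == 2,
}

def validate_record(record):
    """Return the list of field names that fail their rule."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field))]

good = {"loan_amount": 250000, "action_taken": 1, "state": "TX"}
bad = {"loan_amount": -5, "action_taken": 99, "state": "Texas"}
print(validate_record(good))  # []
print(validate_record(bad))   # ['loan_amount', 'action_taken', 'state']
```

Keeping rules in a table (here, a dict of predicates) makes it cheap to add or change a check when the regulatory edits change, without touching the validation loop.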

Confidential, Plano, TX

Sr Data Analyst

Responsibilities:

  • Helped modelers analyze different default ratios by converting flat files into Teradata tables and updating required information from other sources.
  • Gathered all the required home equity loan origination, account closures and transactional data for OCC sales and practices 2016 audit.
  • Developed numerous Teradata SQL queries, creating SET and MULTISET tables, views, and volatile tables, and using inner and outer joins to gather the required information for the OCC audit.
  • Developed a Teradata script to capture all unnoted exceptions in the home equity originations data that underwriters had not captured at the time of application.
  • Developed and maintained a Teradata script to capture all home equity loans that did not meet policy guidelines, and reviewed them with underwriting coaches and quality assurance teams monthly.
  • Updated the daily hours spent on each task by creating and updating the tasks on the JIRA Kanban board.
  • Raised several defects and assigned to development team through HP ALM tool.
  • Raised multiple defects based on test case outcomes and followed up with the development team until the defects were fixed.
  • Performed data quality checks on the warehouse using Teradata and SAS scripts to ensure the accuracy of the data.
  • Uploaded all scripts, documents, and files used in the federal OCC audit to GitHub for future reference.
  • Created several Teradata scripts and datasets with SMR data to help business users in analyzing the anticipated sizing for home equity state expansion.
  • Developed Teradata script to create dataset with home equity pricing changes which helps business users in analyzing the pricing blitz in different states.
  • Worked on loading data from flat files to Teradata tables using SAS Proc Import and Fast Load Techniques.
  • Experienced in using advanced analytical functions.
  • Resolved data issues to improve data quality.
  • Modified the existing mapping template.

Confidential, Plano, TX

Sr Data Analyst

Responsibilities:

  • Developed numerous reports to capture the transactional data for the business analysis.
  • Developed and implemented Data quality checks on the home loans data warehouse tables to improve the quality of source data.
  • Collaborated with a team of business analysts to ensure all requirements were captured.
  • Archived historical Teradata datasets to a UNIX server using SAS scripts and restored them to Teradata whenever required.
  • Developed and modified SAS programs in a UNIX environment and wrote UNIX shell scripts for scheduling jobs.
  • Involved in writing complex SQL queries using correlated sub queries, joins, and recursive queries.
  • Provided data to support the risk models like CCAR, PD (Probability of Default) and LGD (Loss Given Default).
  • Delivered the artifacts within the timelines and excelled in the quality of deliverables.
  • Validated the data during UAT testing.
  • Performed source-to-target mapping.
  • Created weekly, monthly, and quarterly business monitoring reports on different business programs, helping managers make decisions on those programs.
  • Wrote several Teradata SQL Queries using Teradata SQL Assistant for Ad Hoc Data Pull request.
  • Created Teradata objects like Tables and Views.
  • Extensively converted SQL scripts into Teradata scripts.
  • Used UNIX shell scripting to automate logs, back up user-created tables, and check daily logs.
  • Created pivot tables in Excel by getting data from Teradata and Oracle.
  • Involved in metadata management: listed all table specifications and implemented them in the Ab Initio Metadata Hub per data governance.
  • Developed Korn shell scripts to extract and process data from different sources in parallel, improving execution time and resource efficiency.
  • Consolidated data from different sources to build a BI layer for reporting.
  • Provided monthly reports to the loan origination, CRM, and default management teams.
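The parallel-extract pattern above was implemented in Korn shell; the Python sketch below shows the same idea with `concurrent.futures`: pull from several sources at once instead of sequentially. The `extract_source` function and source names are hypothetical stand-ins for real database pulls (e.g. BTEQ exports).

```python
# Sketch of parallel extraction from multiple sources, standing in for
# the Korn shell version. extract_source is a placeholder for a real
# per-source pull (e.g. a BTEQ or SQL export); source names are made up.
from concurrent.futures import ThreadPoolExecutor

def extract_source(name):
    """Placeholder extract: returns the source name and its 'rows'."""
    return name, [f"{name}_row_{i}" for i in range(3)]

sources = ["originations", "servicing", "defaults"]

# One worker per source: all extracts run concurrently, and the slowest
# source, rather than the sum of all sources, bounds the wall-clock time.
with ThreadPoolExecutor(max_workers=len(sources)) as pool:
    results = dict(pool.map(extract_source, sources))

for src in sources:
    print(src, len(results[src]))
```

Threads suit this workload because database extracts are I/O-bound; the shell equivalent backgrounds each extract with `&` and then `wait`s for all of them.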

Confidential, Plano, TX

Data Analyst

Responsibilities:

  • Served as home loans subject matter expert for gathering all the required data for Basel II implementation.
  • Worked on gathering all historical origination, servicing, and default data from legacy systems and other acquired banks.
  • Developed numerous Teradata SQL queries, creating SET and MULTISET tables, views, and volatile tables, and using inner and outer joins.
  • Worked on loading data from flat files to Teradata tables using SAS Proc Import and Fast Load Techniques.
  • Worked extensively on Basel home loans model and served as subject matter expert for home loans subject areas.
  • Implemented indexes, collected statistics, and applied constraints while creating tables and views.
  • Resolved Data issues to improve data quality.
  • Developed numerous reports to capture the transactional data for the business analysis.
  • Involved in Data Migration between Teradata, MS SQL server and Oracle.
  • Collaborated with a team of business analysts to ensure all requirements were captured.
  • Archived historical datasets from Teradata to a UNIX server using SAS scripts and restored them to Teradata whenever required.
  • Assisted the team in standardizing reports using SAS macros and SQL.
  • Developed and modified SAS programs in a UNIX environment and wrote UNIX shell scripts for scheduling jobs.
  • Delivered the artifacts within the timelines and excelled in the quality of deliverables.
  • Involved in metadata management: listed all table specifications and implemented them in the Ab Initio Metadata Hub per data governance.
  • Developed Korn shell scripts to extract and process data from different sources in parallel, improving execution time and resource efficiency.
  • Served as subject matter expert for mortgage data, assisting in requirements gathering, data modeling, and data quality.
