
Systems Analyst Resume

Chicago, IL


  • 9 years of IT experience in the analysis, design, development, testing, implementation, and support of database systems in UNIX and Windows environments.
  • Proficient in analyzing business and system requirements, including source system analysis, data validation, and impact analysis of existing systems, and in creating detailed requirements and mappings in consultation with business users and technical architects.
  • Strong experience in business analysis, design, user requirement gathering, high-level and detailed design analysis, source-to-target mappings, data transformations, and scope identification.
  • Experience with ETL/data warehousing in the healthcare, communications, and retail industries.
  • Good knowledge of Teradata RDBMS architecture, tools, and utilities; strong SQL coding skills.
  • Extracted data from sources such as flat files, IBM DB2, Oracle, and SQL Server in UNIX environments and loaded it into Oracle and Teradata data warehouses.
  • Good experience in performance tuning and query optimization: analyzing explain plans, recreating tables with the right primary index, collecting statistics, and adding secondary or join indexes.
  • Good understanding of systems management and monitoring through Teradata Viewpoint.
  • Expertise in UNIX shell scripting and job scheduling via crontab, CA WA Workstation, and Cisco Tidal Enterprise Scheduler.
  • Expertise in report formatting, batch processing, and data import/export using the BTEQ, FastLoad, MultiLoad, TPT, and FastExport utilities to load data into target data warehouses.
  • Designed and built stored procedures and macros for various modules; experienced in troubleshooting and tuning SQL statements.
  • Wrote deployment/release notes before project releases and end-user manuals for the production team.
  • Performed data validation through unit, integration, and user acceptance testing.
  • Proficient in logical and physical data modeling; worked extensively on several projects in both forward and reverse engineering.
  • Well versed in all stages of the Software Development Life Cycle (SDLC): requirements gathering and analysis, design, implementation, and testing.
  • Good understanding of Agile methodology and the Scrum process.
  • Complete domain and development life cycle knowledge of data warehousing and client-server concepts, with a good grasp of basic data modeling.
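The tuning work above typically revolves around reading EXPLAIN output and keeping optimizer statistics current. A minimal sketch, written as a shell wrapper; the table (sales_fact) and column (cust_id) are hypothetical placeholders:

```shell
#!/bin/sh
# Sketch only: write the tuning statements to a script file.
# sales_fact and cust_id are made-up names, not a real system.
cat > tune_sales_fact.sql <<'EOF'
EXPLAIN
SELECT cust_id, SUM(sale_amt)
FROM   sales_fact
GROUP  BY cust_id;          /* inspect join strategy and spool estimates */

COLLECT STATISTICS ON sales_fact COLUMN (cust_id);  /* refresh stats */
EOF

# Run only where the Teradata BTEQ client is actually installed.
if command -v bteq >/dev/null 2>&1; then
  bteq < tune_sales_fact.sql
fi
```

The pattern is the same for any table: read the plan first, then collect statistics on the join and primary-index columns the plan flags.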


RDBMS: Teradata 14.10/13.x/12, Oracle 9i/10g, SQL Server 2008, IBM DB2

Client tools/utilities: BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT, Viewpoint, OLE Load, Oracle SQL*Loader, CA WA Workstation, Cisco Tidal Enterprise Scheduler

ETL: DataStage 8.5, IBM WebSphere Information Server, SAS 9.1, Informatica PowerCenter

Languages: SQL, PL/SQL, Teradata SQL

Scripting languages: UNIX shell scripting, Perl scripting

Operating Systems: Windows 7/Vista/XP/2003/2000, IBM AIX, Linux

Other Tools: MS Visio, Erwin, Tableau, Cognos, SSRS, Word, Excel, PowerPoint, Outlook


Confidential, Chicago, IL

Systems Analyst


  • Worked closely with business analysts to gather business questions and requirements.
  • Engaged with business leads to confirm requirements and priorities throughout the development life cycle.
  • Participated in design discussions and meetings to determine the appropriate lowest level of grain.
  • Created detailed source-to-target mapping documents for new enhancements and project requirements; worked with business SMEs to document the business rules for ETL development.
  • Worked with modelers to develop the logical and physical data models, reviewing and revising them based on requirements.
  • Developed the overall technical approach to the solution design, reviewed it with the team, and guided development and quality analysis.
  • Provided design inputs and assisted developers in arriving at an optimal ETL design solution.
  • Facilitated requirements and logical design discussions and supported developers in physical design and development hand-off as needed.
  • Worked on specifications from the data governance and data quality teams, which required managing data from all business units and ensuring data quality across the enterprise.
  • Built and implemented changes within the given estimates; identified the root cause of incidents in the production environment.
  • Raised change requests, usually planned changes, to implement fixes in production.
  • Enhanced BTEQ and MultiLoad import/export scripts to load data into the target data warehouse.
  • Wrote complex SQL queries based on the given requirements and various business needs.
  • Performed unit and integration testing and prepared the project's technical and unit-test documents.
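BTEQ export scripts of the kind enhanced here follow a fairly fixed skeleton. A sketch with hypothetical logon, database, and file names (the password is masked):

```shell
#!/bin/sh
# Sketch of a BTEQ report export; tdprod, dw.claims, and the file
# names are placeholders, not real systems or credentials.
cat > export_claims.btq <<'EOF'
.LOGON tdprod/etl_user,********;
.EXPORT REPORT FILE = claims_extract.txt;
SELECT claim_id, member_id, paid_amt
FROM   dw.claims
WHERE  load_dt = CURRENT_DATE;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF

# Submit only where the BTEQ client is installed.
if command -v bteq >/dev/null 2>&1; then
  bteq < export_claims.btq
fi
```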

Environment: Teradata Tools and Utilities, Viewpoint, DataStage 9.1, DB2, SQL Server, Ultra Edit, ASG-Zena Workstation, Cognos, HP ALM, UNIX, MS Visio and Windows.

Confidential, Bloomfield, CT

Teradata Consultant


  • Developed processes to extract source data from different systems, then cleansed, transformed, integrated, and loaded the extracts.
  • Interacted with business users to gather the requirements needed to build a warehouse supporting report generation in an Agile SDLC environment.
  • Proactively and periodically checked with users on issues and upcoming tasks.
  • Designed physical database objects such as tables, indexes, views, and macros on the Teradata system while maintaining client-specific naming standards.
  • Wrote complex SQL queries based on the given requirements and various business needs.
  • Wrote Korn shell scripts to move data from source systems into the target data warehouse.
  • Developed scripts to load data from source to staging and from staging to target tables using load utilities such as BTEQ, FastLoad, MultiLoad, and TPT.
  • Tuned existing load scripts and processes to improve performance and reduce load times for faster user queries.
  • Worked on code remediation to maintain a standard process across environments.
  • Set global environment variables and parameterization to standardize all scripts.
  • Automated batch job workloads and scheduling using crontab and the CA Workstation scheduling tool, based on business requirements.
  • Scheduled jobs to run daily, monthly, and quarterly using the CA scheduler to support report generation.
  • Performed release management activities in support of production deployments to achieve business goals.
  • Supported the AIX UNIX to Linux server migration, moving and testing legacy tables and objects.
  • Performed unit and integration testing and prepared the project's technical and unit-test documents.
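The parameterized Korn shell wrappers and cron scheduling described above can be sketched roughly as follows; the environment file, TD_* logon variables, and table names are all hypothetical:

```shell
#!/bin/ksh
# Sketch of a parameterized load wrapper; etl_env.sh, the TD_*
# variables, and stg.customer/landing.customer are made up.
ENV_FILE=./etl_env.sh
LOG_DIR=./logs
mkdir -p "$LOG_DIR"

# Global environment variables keep logons and paths out of the scripts.
if [ -f "$ENV_FILE" ]; then
  . "$ENV_FILE"
fi

# Unquoted heredoc: the shell substitutes the logon variables into BTEQ.
cat > load_stage.btq <<EOF
.LOGON ${TDP_ID}/${TD_USER},${TD_PASS};
INSERT INTO stg.customer SELECT * FROM landing.customer;
.LOGOFF;
.QUIT;
EOF

if command -v bteq >/dev/null 2>&1; then
  bteq < load_stage.btq > "$LOG_DIR/load_stage.log" 2>&1
fi

# A crontab entry running the wrapper nightly at 02:00 could look like:
#   0 2 * * * /app/etl/load_stage.ksh >> /app/etl/logs/cron.log 2>&1
```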

Environment: Teradata Studio 14.10, Teradata Tools and Utilities, Viewpoint, SAS 9.4, SAS Enterprise Guide 6.1, DB2, Oracle 12, Ultra Edit, CA WA Workstation, Tableau 9.0, IBM AIX, Linux, Rally, MS Visio

Confidential, Ohio

Teradata Consultant


  • Gathered requirements from business users and performed analysis, design, development, and testing.
  • Designed databases and prepared SQL scripts to support large databases holding terabytes of data.
  • Performed data analysis, identified issues, and recommended solutions.
  • Wrote Teradata BTEQ scripts to implement business logic.
  • Fine-tuned existing scripts and processes to increase performance and reduce load times for faster user queries.
  • Performed mapping between source and target data, as well as logical-to-physical model mapping.
  • Wrote conversion code per the business logic using BTEQ scripts.
  • Loaded data into the Teradata database using utilities such as FastExport, FastLoad, MultiLoad, and TPump.
  • Used Teradata Viewpoint for query monitoring and performance analysis.
  • Performed database capacity and forecast analysis for new and existing projects.
  • Scheduled jobs for batch processing using Cisco Tidal Enterprise Scheduler.
  • Tuned the performance of sources, targets, mappings, and SQL queries in transformations.
  • Developed unit test plans and participated in system testing.
  • Worked on complex queries to map data per business requirements.
  • Designed and implemented stored procedures and triggers to automate tasks in SQL.
  • Involved in all phases of the SDLC in an Agile environment, from requirements, design, development, and testing through rollout to field users and production support.
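Stored procedures like those mentioned above are written in Teradata's SPL. A minimal sketch of one that logs a batch run; the procedure and table names (dw.log_batch_run, dw.audit_log) are hypothetical:

```shell
#!/bin/sh
# Sketch: DDL for a small Teradata stored procedure, written to a
# file. dw.log_batch_run and dw.audit_log are made-up names.
cat > create_proc.sql <<'EOF'
REPLACE PROCEDURE dw.log_batch_run (IN p_job_name VARCHAR(64))
BEGIN
  /* Parameters take a colon prefix inside embedded SQL statements. */
  INSERT INTO dw.audit_log (job_name, run_ts)
  VALUES (:p_job_name, CURRENT_TIMESTAMP);
END;
EOF

if command -v bteq >/dev/null 2>&1; then
  bteq < create_proc.sql
fi
```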

Environment: Teradata 13.10, Cisco Tidal Enterprise Scheduler 6.1, Informatica PowerCenter 9.1, Business Objects, SQL Server 2008, BTEQ, FastLoad, MultiLoad, Teradata Viewpoint, Erwin, UNIX, Windows

Confidential, Lubbock, TX

ETL / Teradata Developer


  • Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.
  • Loaded data from several flat-file sources using Teradata FastLoad and MultiLoad.
  • Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
  • Loaded and transferred large data sets from different databases into Teradata using MultiLoad and OLE Load, and sorted data files using UNIX shell scripting.
  • Fine-tuned MultiLoad scripts, taking into account the number of scheduled loads and the data volumes.
  • Developed Informatica workflows and mappings as well as BTEQ scripts.
  • Tuned the performance of Informatica mappings and Teradata BTEQ scripts.
  • Developed unit test cases for the mappings.
  • Used ETL methodology to support data extraction, transformation, and loading.
  • Prepared ETL strategies for extracting data from sources such as SQL Server and flat files.
  • Wrote Teradata BTEQ scripts to implement business logic and developed UNIX shell scripts for data manipulation.
  • Created test cases for unit testing, system integration testing, and UAT to verify the data.
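A FastLoad control script for flat-file loads like the ones above generally looks like the sketch below; the logon, table, and file names are placeholders:

```shell
#!/bin/sh
# Sketch of a FastLoad script: pipe-delimited flat file into an
# empty staging table. stg.orders and orders.dat are hypothetical.
cat > load_orders.fl <<'EOF'
LOGON tdprod/etl_user,********;
SET RECORD VARTEXT "|";
BEGIN LOADING stg.orders ERRORFILES stg.orders_e1, stg.orders_e2;
DEFINE order_id (VARCHAR(18)),
       cust_id  (VARCHAR(18)),
       amt      (VARCHAR(18))
FILE = orders.dat;
INSERT INTO stg.orders VALUES (:order_id, :cust_id, :amt);
END LOADING;
LOGOFF;
EOF

if command -v fastload >/dev/null 2>&1; then
  fastload < load_orders.fl
fi
```

FastLoad requires an empty target table, which is why it pairs with staging tables; MultiLoad handles already-populated targets.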

Environment: Teradata 13.10 (BTEQ, FastLoad, MultiLoad, Teradata SQL, FastExport), Informatica PowerCenter, Business Objects, Erwin, DB2, Oracle, Shell scripting, UNIX, SUSE Linux, Windows

Confidential, Pittsburgh, PA

ETL Developer


  • Requirements gathering and analysis, design, development, and testing.
  • Created and maintained BTEQ, FastLoad, and MultiLoad scripts for loading data into different layers of the warehouse per business requirements.
  • Identified long-running queries and scripts and resolved bottlenecks.
  • Implemented steps to eliminate duplicates and resolve spool space issues.
  • Optimized job performance through performance tuning.
  • Created stage tables and indexes according to client instructions.
  • Used FastExport scripts to export data to flat files.
  • Monitored production jobs on a regular basis for given time periods and provided support when issues arose.
  • Wrote UNIX shell scripts for ETL operations, automating warehouse jobs and scheduling cron jobs in production.
  • Wrote unit test cases and submitted unit test results.
  • Documented solutions to issues and prepared root cause analyses for reported problems.
  • Tracked and eliminated defects using defect-tracking sheets.
  • Participated in client meetings, BA meetings, design meetings, and production deployment meetings.
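The duplicate-elimination step above is commonly done with Teradata's QUALIFY clause; a sketch with hypothetical table and column names:

```shell
#!/bin/sh
# Sketch: keep only the most recent row per business key.
# stg.customer, dw.customer_clean, cust_id, load_ts are made up.
cat > dedupe.sql <<'EOF'
INSERT INTO dw.customer_clean
SELECT *
FROM   stg.customer
QUALIFY ROW_NUMBER() OVER (PARTITION BY cust_id
                           ORDER BY load_ts DESC) = 1;
EOF

if command -v bteq >/dev/null 2>&1; then
  bteq < dedupe.sql
fi
```

Filtering with QUALIFY can also keep spool usage lower than a self-join against MAX(load_ts), which helps with the spool space issues mentioned above.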

Environment: Teradata 12, Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, DataStage 8.1, Oracle 10g, SAS 9.1, DB2, SQL Server 2005, MS SQL, Erwin, UNIX, Shell scripting


Database Developer


  • Extracted data from existing data sources and performed ad-hoc queries using SQL.
  • Worked on tables and views using SQL scripts.
  • Experienced in exporting data to flat files in .csv, tab-delimited, and fixed-width formats.
  • Experienced in creating design documents and writing unit test cases.
  • Designed and developed logical and physical data models for systems holding 30-50 terabytes of data.
  • Exported data using Teradata FastExport.
  • Documented scripts, specifications, and other processes.
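Flat-file exports like these are typically driven by a FastExport script; a sketch with placeholder logon and object names:

```shell
#!/bin/sh
# Sketch of a FastExport script producing a pipe-delimited file.
# dw.accounts, dw.fx_log, and the logon are hypothetical.
cat > export_accounts.fx <<'EOF'
.LOGTABLE dw.fx_log;
.LOGON tdprod/etl_user,********;
.BEGIN EXPORT;
.EXPORT OUTFILE accounts.dat MODE RECORD FORMAT TEXT;
SELECT TRIM(acct_id) || '|' || TRIM(acct_name)
FROM   dw.accounts;
.END EXPORT;
.LOGOFF;
EOF

if command -v fexp >/dev/null 2>&1; then
  fexp < export_accounts.fx
fi
```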

Environment: Teradata V2R6, Oracle 9i, SQL, PL/SQL, MS Excel, PowerPoint, Windows


Software Engineer


  • Involved in the development life cycle; analyzed business requirement documents to better understand the system from both technical and business perspectives.
  • Provided support for security applications and infrastructure-related projects.
  • Resolved technical issues and assisted with security incident handling.
  • Checked existing accounts and data access permissions against documented authorizations.
  • Maintained all log-on identifications and data access, including specific access to applications, networks, files, and database management systems.
  • Ran security analysis reports using custom UNIX shell scripts and documented gaps.
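An access-review script of the kind described above boils down to comparing an account extract against the documented authorization list. A runnable sketch with made-up sample data (the colon-delimited file format is an assumption):

```shell
#!/bin/sh
# Sketch: flag accounts that appear in the extract but not in the
# documented authorization list. All data here is made up.
cat > accounts.txt <<'EOF'
alice:app_finance
bob:app_hr
carol:app_finance
EOF

cat > authorized.txt <<'EOF'
alice:app_finance
bob:app_hr
EOF

# Lines in the extract with no exact match in the authorization list:
grep -F -v -x -f authorized.txt accounts.txt > gaps.txt
cat gaps.txt    # prints the unauthorized entry, here carol:app_finance
```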

Environment: Oracle 9i, AS/400, WebSphere 6.0, Toad 9.6, PL/SQL, XML, UNIX, Windows
