Data Architect Resume
Chicago, IL
PROFESSIONAL SUMMARY:
- 10+ years of IT experience in database administration and development using DB2 on UNIX, AIX, Linux, Windows, and z/OS (including Bluemix/SoftLayer) and MS SQL Server on-premises and in Azure, with additional experience administering Oracle, MySQL, and PostgreSQL.
- Extensive experience supporting DB2 system administration and application development: resolving DB2 issues, performance tuning, and monitoring data security and integrity. Install base code and the latest fix packs and maintain DB2 environments, including creating instances, databases, and buffer pools and performing logical and physical design of database storage.
- Expertise in implementing DB2 HADR with multiple standby servers, HACMP, replication, federation, and backup and recovery strategies across multiple data centers and vendors; develop database architecture using tools such as Erwin and ER/Studio. On SQL Server, set up Always On, clustering, multiple instances, and migrations to Azure.
- Expertise in setting up and managing DB2 LUW servers, instances, and databases in SoftLayer (Bluemix), Azure, and AWS.
- Versed in DB2 in-place and side-by-side migration of instances and databases. Expertise in migrating databases between vendors (DB2, Oracle, SQL Server) using tools such as IBM MTK, SQL Server Migration Assistant, and SQL Developer.
- Database utility experience including REORG, RUNSTATS, REBIND, EXPORT, IMPORT, LOAD, bulk loads, Data Pump, DBVERIFY, autoloader, db2move, db2look, BACKUP, RMAN, RESTORE, and ROLLFORWARD (a schema-copy sketch follows this summary).
- Supervisory experience leading a team of 8, with effective communication skills and the ability to work productively with personnel at all levels.
- Experience with ETL tools (Informatica, DataStage, SSIS) and reporting tools (Business Objects, SSRS, SAS Enterprise).
- Database monitoring and tuning experience using the SolarWinds toolset, IBM Data Server Manager, DB2 Performance Monitor, Detector, OMEGAMON, snapshot and event monitors, db2audit, explain output, Index Advisor, Spotlight, Foglight, BPPM, and SCOM, plus custom Korn shell scripts using AWK, SED, and GAWK.
- Worked extensively with LDAP, DB2 SSL, TSA setup, SMTP setup, DB2 Client, DB2 Connect, IBM WAS, ISAM, and Java/J2EE applications, and resolved user issues concerning connectivity to applications and databases.
- Follow IT Service Management (ITSM) processes based on ITIL best practices, using Remedy, Cherwell, Track-It, ServiceNow, Salesforce, JIRA, MESD, and GoToAssist to work incidents, tasks, and changes through to resolution.
- Worked with third-party utilities including Quest tools, BMC tools, Nagios, DBArtisan, Toad, Performance Expert, Spotfire, Idera SQL Safe, NetBackup, DBeaver, DBVisualizer, and the SolarWinds DPA toolset.
- Proficient in scheduling and maintaining jobs with cron, AutoSys, the NetBackup scheduler, Windows Scheduler, SQL Server Agent, TSM, and TWS based on application requirements; provided 24x7 on-call support for databases backing online applications and performed off-hours maintenance.
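The db2move/db2look side-by-side copy workflow referenced in the summary can be sketched as follows; this is a minimal, hedged example, and the database names SAMPLEDB/SAMPLEDB2 and file names are illustrative placeholders rather than values from any engagement.

```sh
#!/bin/ksh
# Hedged sketch: copy a database's DDL and data side-by-side with db2look/db2move.
# SAMPLEDB (source) and SAMPLEDB2 (target) are placeholder names.
db2look -d SAMPLEDB -e -l -x -o sampledb_ddl.sql   # extract object DDL, storage DDL, and grants
db2move SAMPLEDB export                            # export table data into the current directory

# On the target instance, after creating an empty database:
db2 connect to SAMPLEDB2
db2 -tvf sampledb_ddl.sql                          # replay the extracted DDL
db2 connect reset
db2move SAMPLEDB2 load                             # load the exported data into the new database
```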
TECHNICAL SKILLS:
RDBMS (latest versions): DB2 11.1, SQL Server 2016, Oracle 12c, Sybase, MySQL, PostgreSQL
Database Tools: DB2 Client toolset, IBM DS, IBM DSM, BMC & Quest tools, ERWIN, Toad, VERITAS NetBackup, SSMA, SSIS, SSRS, DataStage, Cognos, Business Objects (BO XI R2), Informatica (ETL), SAS, IBM MTK, CA tools for z/OS, SPUFI, File-AID, IBM OMEGAMON, Detector, Visual Explain, Optimization Service Center, BMC LogMaster
Cloud: Bluemix (Softlayer), Azure, AWS.
Application Server: WAS, Apache Tomcat Server, CICS
Operating Systems: UNIX, AIX, Linux, z/OS, Windows
Scripting/Languages: .NET Framework, JCL, COBOL, Shell Scripting (Bash, Korn), DB2 SQL PL, T-SQL, PL/SQL, C
Schedulers: Cron Jobs, Autosys, Windows Scheduler, Informatica Scheduler, ESP
PROFESSIONAL EXPERIENCE:
Confidential, Chicago IL
Data Architect
Responsibilities:
- Collaborate with cross-functional teams to support business case development and identify the modeling methods that provide business solutions. Determine the appropriate statistical and analytical methodologies to solve business problems within specific areas of expertise.
- Documented all data mapping and transformation processes in the functional design documents based on the business requirements; queried the databases, wrote test validation scripts, and performed system testing.
- Owned and managed all changes to the data models. Created data models, solution designs, and data architecture documentation for complex information systems.
- Generated data models using Erwin 9.6, developed the relational database system, and performed logical modeling using dimensional modeling techniques such as star schema and snowflake schema.
- Guide the full lifecycle of a Hadoop solution, including requirements analysis, platform selection, technical architecture design, application design and development, testing, and deployment
- Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
- Created data models for AWS Redshift, Hive, and HBase from dimensional data models.
- Worked on NoSQL databases including HBase, MongoDB, and Cassandra. Implemented a multi-datacenter, multi-rack Cassandra cluster.
- Defined Big Data strategy, including multi-phased implementation roadmaps; imported data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
- Managed the timely flow of business intelligence information to users and performed normalization and de-normalization of existing tables for faster query retrieval.
- Involved in creating screen designs, use cases, and ER diagrams for the project using ERWIN and Visio.
- Analyzed the business information requirements and researched the OLTP source systems to identify the measures, dimensions, and facts required for the reports.
- Performed data mapping from source systems to target systems and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data.
- Led the design of high-level conceptual and logical models that facilitate a cross-system, cross-functional view of data requirements.
- Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
- Enabled speedy reviews and first-mover advantage by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
- Maintained conceptual, logical, and physical data models along with corresponding metadata.
- Performed data migration from an RDBMS to a NoSQL database and provided a complete picture of the data deployed across the various data systems.
- Designed, developed, and maintained the data dictionary and metadata for the models.
- Involved in data warehouse support, using star schema and dimensional modeling to help design the data marts and data warehouse.
- Developed Linux shell scripts using the NZSQL/NZLOAD utilities to load data from flat files into the Netezza database.
- Responsible for metadata management, keeping centralized metadata repositories up to date using Erwin modeling tools.
- Developed triggers, stored procedures, functions, and packages in PL/SQL using cursor and REF CURSOR concepts for the project.
- Prepared documentation for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, and glossary terms as they evolved and changed during the project.
- Exported the analyzed patterns back to Teradata using Sqoop (see the sketch after this list).
- Troubleshoot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
- Used Normalization methods up to 3NF and De-normalization techniques for effective performance in OLTP systems.
- Developed complex Stored Procedures for SSRS (SQL Server Reporting Services) and created database objects like tables, indexes, synonyms, views, materialized views etc.
- Performed performance tuning of the Oracle database using the EXPLAIN PLAN and TKPROF utilities and debugged the SQL code.
- Performed analyses such as regression, logistic regression, discriminant analysis, and cluster analysis using SAS programming.
- Excellent understanding of the MDM approach to creating a data dictionary, using Informatica or other tools to map sources to the target MDM data model.
- Used a metadata tool to import metadata from the repository, create new job categories, and create new data elements.
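A hedged sketch of the Sqoop export to Teradata mentioned in the list above; the host, credentials, table, and HDFS directory are hypothetical placeholders, and the generic JDBC driver shown is only one of several connector options.

```sh
# Hedged sketch: export refined results from HDFS back to Teradata via Sqoop.
# Host, credentials, table, and HDFS directory below are placeholders.
sqoop export \
  --connect jdbc:teradata://td-prod.example.com/DATABASE=ANALYTICS \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user -P \
  --table CUST_SEGMENT_SCORES \
  --export-dir /edw/refined/cust_segment_scores \
  --input-fields-terminated-by '\t'
```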
Environment: Erwin r9.6, DB2, Oracle 12c, SQL Server 2016, Linux, Hive, AWS, S3, AWS Redshift, NoSQL, Teradata, Netezza, PL/SQL, T-SQL, SQL, MS Visio, Informatica, Crystal Reports 2008, Java, HDFS, Pig, SSRS, SSIS, IBM Data Studio, IBM Data Server Manager, IBM Rational, SAS, Toad, Track-It, metadata management.
Confidential
Sr. Operations & Maintenance Lead
Responsibilities:
- The USAID GHSC-PSM project integrates two former USAID programs into one efficient supply chain that serves many of the world’s most vulnerable and difficult-to-reach communities. The project is designed to meet today’s critical global health challenges.
- Support Linux, DB2 database servers, B2B Sterling Integrator, IIB, Cognos reporting, and DataStage servers on IBM SoftLayer cloud and bare-metal servers.
- Currently providing DB2 LUW V10.5 DBA implementation support for a worldwide supply chain application. Install and upgrade DB2, related components, and Sterling products.
- Created 25+ DB2 instances and all DB2 objects. Configured DB2 automatic maintenance for online backups and RUNSTATS.
- Installed and configured IBM Data Server Client V10.5 on the IIB servers, cataloging nodes and monitoring remote databases (see the sketch after this list).
- Set up and maintain test and production HADR. Installed DB2 V10.5 FP5 and FP7, IBM Application Developer Tools, and Data Server Manager for monitoring, database alerts, and health checks.
- Create Linux user IDs and groups. Grant group-level DB2 privileges and assign users to groups.
- Use NMON to monitor performance and generate reports. Add storage to file systems and perform routine system engineering tasks.
- Generate various list, grouped, crosstab, chart, drill-down, and drill-through reports.
- Generate complex reports in Cognos 10.1 Report Studio, including drill-down reports from a DMR-modeled Framework Manager model.
- Manually modified the SQL in Report Studio to tune and/or write complicated reports. Used union/join objects in Report Studio.
- Performance tuning and optimization of parallel jobs and server jobs.
- Used Parallel Extender to load data into the data warehouse with techniques such as pipeline and partition parallelism in an MPP environment.
- Experience in UNIX/Linux shell scripting for file manipulation, and strong knowledge of scheduling DataStage jobs.
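A minimal sketch of the client cataloging and automatic-maintenance configuration described above; the node, host, port, and database names are placeholders.

```sh
# Hedged sketch: catalog a remote DB2 database from the IBM Data Server Client
# and enable automatic maintenance. All names, hosts, and ports are placeholders.
db2 catalog tcpip node SCMNODE remote db2prod01.example.com server 50000
db2 catalog database SCMDB at node SCMNODE
db2 terminate                                   # refresh the directory cache
db2 connect to SCMDB user db2inst1              # quick connectivity check (prompts for password)

# On the database server, enable automatic online backups and RUNSTATS:
db2 update db cfg for SCMDB using AUTO_MAINT ON AUTO_DB_BACKUP ON AUTO_TBL_MAINT ON AUTO_RUNSTATS ON
```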
Confidential - Indianapolis, IN
Sr. Production Hosting DBA
Responsibilities:
- Managing DB2 databases for the CHESSIE application (Maryland’s Children’s Electronic Social Services Information Exchange), which tracks child welfare data statewide. Also manage the Rational application, which is hosted on DB2.
- Managed database servers for SQL Server (50+ instances with Always On and clustering) and DB2 (HADR) hosting applications connected to MyDHR. Apart from Maryland DHR, also supported databases hosted for Indiana, ELIS, Oklahoma, and Illinois applications/projects.
- Manage CHESSIE application database development, making sure all the stored procedures are coded in an optimal way. Develop objects as necessary to accommodate development.
- Create application-specific scripts to recover test database environments from the production environment for DB2/SQL Server databases on AIX and Windows (batch scripts).
- Used tools such as IBM Data Studio and client tools to develop and validate stored procedures and user-defined functions.
- Set up and manage SQL Server 2014 databases with Always On high availability and multiple secondary replicas.
- Regularly refresh QA/DEV and production reporting systems with production data per client requirements and ad hoc testing needs to support successful implementation of development work.
- Monitor expensive queries using EXPLAIN tables and by running the Visual Explain utility on SQL statements for optimal performance (see the sketch after this list).
- Continually update database- and instance-level configuration parameters to keep up with deprecated parameters and to determine the best settings for the database to perform well.
- Use Visio/Erwin Data Modeler to develop the architecture and ER diagrams.
- Use tools such as DB2 Performance Monitor, Snapshot Monitor, Event Monitor, Explain, Visual Explain, and Index Advisor to find bottlenecks. Extensively used db2top to monitor databases in both interactive and batch modes.
- Provide 24x7 database support, with on-call rotation, for applications going live or in maintenance mode.
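A minimal sketch of the explain-table workflow referenced above; the database name, schema, and query are hypothetical placeholders.

```sh
# Hedged sketch: capture and format the access plan of a suspect statement.
# Database name, schema, and query are hypothetical placeholders.
db2 connect to CHESSIE
db2 -tvf ~/sqllib/misc/EXPLAIN.DDL                  # create the explain tables (one-time setup)
db2 set current explain mode explain
db2 -v "SELECT * FROM APP.CASE_PLAN WHERE CASE_ID = 12345"   # statement is explained, not executed
db2 set current explain mode no
db2exfmt -d CHESSIE -1 -o case_plan_access_plan.txt # format the most recent explain output
```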
Environment: DB2 UDB 10.1/10.5 (Linux, Windows), SQL Server 2012/2014/2016, db2top, IBM Data Studio, ServiceNow, Control Center, Spotlight, crontab, CHESSIE.
Confidential - Sacramento, CA
Sr. DB2 Systems DBA/ SQL Server/Oracle DBA
Responsibilities:
- Managing DB2 databases to support the LAWSON ERP application, which runs the entire Confidential hospital system, payments system, human resources system, and supply chain management.
- Confidential has around 1,000 SQL Server database servers, 50 DB2 servers, and around 50 Oracle servers; all server maintenance was handled by a group of 3 initially, and the team later grew to 8.
- Analyzed and set up HADR between databases, and configured read-only standby databases (see the sketch after this list).
- Manage multi-terabyte DB2 database environments supporting different applications, with database sizes ranging from 100 GB to 20 TB.
- Installed DB2 V9.7 and V10.5 on new servers, migrated multi-terabyte databases from V9.1 to V9.7 and V10.1, and upgraded hardware from IBM P5 to IBM P8 machines based on application demand.
- Administered large 64-bit DB2 v9.x and v10.5 LUW databases in a 24x7 enterprise SAN-based multi-data-center environment, including installation, configuration, backup, recovery, and tuning.
- Migrated file systems from 32 GB LUNs to 256 GB LUNs with no downtime for ease of server administration, where databases have a large number of table spaces and file systems.
- Created POCs and educated application teams to take advantage of automatic maintenance, STMM, and monitor-collection setting enhancements available in DB2 V9.7.
- Created health-check scripts using the new SYSIBMADM views and MON_GET functions and educated non-DBA teams on handling performance issues.
- Created scripts for data maintenance activities such as REORG, RUNSTATS, LOAD, UNLOAD, IMPORT, EXPORT, and backups (incremental or full), and scheduled jobs using Maestro with backups directed to tape.
- Created application-specific scripts to recover test database environments from the production environment for DB2, SQL Server, and Oracle databases on UNIX, AIX, Linux, and Windows (batch scripts).
- Enabled STMM on DB2 servers, set up end-to-end SQL Replication between transactional and reporting environments, and designed and implemented the best possible solution for transaction logging.
- Installed SQL Server Standard and Enterprise editions and baselined the servers according to organizational standards.
- Created SQL Server health-check stored procedures and jobs for easy database maintenance upon customer request.
- Provided 24x7 database support, with on-call rotation, for applications going live or in maintenance mode.
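A minimal sketch of an HADR primary-side configuration of the kind described above; hostnames, ports, the instance name, and the database name are placeholders, and the standby is assumed to be restored from a backup of the primary with mirror-image settings.

```sh
# Hedged sketch: HADR primary-side configuration with reads on standby.
# Hostnames, ports, instance, and database name are placeholders.
db2 update db cfg for LAWSONDB using \
    HADR_LOCAL_HOST dbprim01 HADR_LOCAL_SVC 51012 \
    HADR_REMOTE_HOST dbstby01 HADR_REMOTE_SVC 51012 HADR_REMOTE_INST db2inst1 \
    HADR_SYNCMODE NEARSYNC HADR_TIMEOUT 120
db2set DB2_HADR_ROS=ON                       # allow read-only connections (set on the standby instance)
db2 start hadr on db LAWSONDB as standby     # run on the standby server first
db2 start hadr on db LAWSONDB as primary     # then on the primary
db2pd -db LAWSONDB -hadr                     # verify HADR state and log gap
```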
Environment: DB2, Linux, AIX, Windows, SQL Server, Oracle, IBM Data Studio, TOAD for DB2, Spotlight for Oracle, SQL and DB2, Quest Central, BMC Remedy, Replication Center, Spotlight, Load Runner, Idera SQLSafe, Maestro, crontab, LAWSON, NetBackup, Cherwell.
Confidential - Sacramento, CA
DB2 LUW Application DBA/Developer (z/OS, LUW)/Performance Analyst
Responsibilities:
- Designed and developed DB2 databases for the Single Client Database application (SCDB and eApply Modernization) at Confidential.
- Converted IDMS to DB2 on z/OS (SCDB), the largest such conversion project, and designed the database for the eApply Modernization project enabling claimants to apply for UI and DI online.
- The databases are highly transactional with huge amounts of data and were designed for optimal performance, with sizes ranging from 1 TB to 5 TB.
- Administered large 64-bit DB2 v9.5 and v9.7 LUW databases in a 24x7 enterprise SAN-based multi-data-center environment, including installation, configuration, backup, recovery, and tuning.
- Used Visio/Erwin Data Modeler to develop the architecture and ER diagrams.
- Used tools such as IBM Data Studio and Development Center (V8 client) to develop stored procedures.
- Used tools such as Snapshot Monitor, Optimization Service Center, OMEGAMON, CA Platinum tools, Event Monitor, Detector, Explain, Visual Explain, and Index Advisor to find bottlenecks.
- Worked on data maintenance activities such as REORG, RUNSTATS, LOAD, UNLOAD, IMPORT, EXPORT, and backups (incremental or full), and scheduled jobs to run with the ESP scheduler.
- Generated database dictionaries using Toad, created reports, and gained extensive experience working with SOFTBASE, a third-party tool that provides users the same functionality as the IDMS database.
- Reviewed and revised the architecture to get the best performance for the application. Worked with load/performance/stress testing teams to identify the bottlenecks where response times were high.
- Worked mostly on identifying optimal indexes for effective data retrieval (see the sketch after this list) and created batch processes according to business requirements to update data.
- Created EXPLAIN tables and ran the Visual Explain utility on SQL statements.
- Used crontab/ESP to schedule the batch processes.
- Query tuning for the problematic queries identified during database monitoring.
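A hedged sketch of index evaluation with the DB2 Design Advisor, as referenced above; the database, table, and predicate values are hypothetical.

```sh
# Hedged sketch: evaluate candidate indexes for a slow statement with the Design Advisor.
# Database, schema, table, and predicate values are hypothetical placeholders.
cat > slow_query.sql <<'EOF'
SELECT CLAIM_ID, STATUS FROM APP.CLAIM WHERE CLAIMANT_ID = 123456 AND STATUS = 'OPEN';
EOF
db2advis -d SCDB -i slow_query.sql -t 5 -o advis_recommendations.sql   # 5-minute limit; recommended DDL written to file
```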
Environment: DB2 UDB 9.x (Linux, AIX, Windows), DB2 V9 NFM (z/OS), DB2 V8 (z/OS), IDMS, JCL, COBOL, REXX, IBM Data Studio 2.2.1, Quality Center, TOAD for DB2, Quest Central, TFS, Load Runner, Development Center (DB2 V8 Client), Visual Studio, Visual Explain, Optimization Service Center, OMEGAMON, CA Platinum Tools, BMC Tools.
Confidential - New York City, NY
Sr. DB2 Database Administrator/ Developer
Responsibilities:
- DB2 database administration on Linux, UNIX, and Windows, including audit configuration, SQL tuning, resource monitoring, and security architecture reviews.
- Proficiently designed solutions under both Agile and Waterfall models, including projects depending entirely on federation and replication.
- Created POCs (proofs of concept) for projects with new security levels where data is driven by web services.
- Regularly migrated instances and databases to current DB2 versions/fix packs to incorporate new functionality. Migrated from V9.1 to V9.5, and to V9.7 where the business demanded it.
- Coordinated with individual business owners to implement STMM on all shared servers. Designed the system storage structure and planned future storage requirements (design and analysis of file systems, logical volumes, and raw containers).
- Database backup and recovery (offline and online backups: full, incremental, and delta incremental). Database cloning using redirected restores.
- Evaluated tools such as TIBCO Spotfire and Performance Expert for the company’s internal use.
- Worked mostly on handling federation server issues: creating wrappers, server definitions, nicknames, and federated procedures for Sybase, DB2, Oracle, SQL Server, and all kinds of files (see the sketch after this list).
- Created a strategy for accessing different sources through DB2 federation when source servers have different character sets enabled.
- Handled issues through Falcon tickets opened by developers organization-wide. Apart from Falcon tickets, worked on four projects (SalesGPS, ResearchGPS, TradeGPS, Risk Reward) as a member of the development team, creating stored procedures, functions, tables, views, triggers, and federated objects.
- Worked closely with load and performance testing teams to narrow down and fix unexpected application response times. Managed database security, creating and assigning appropriate roles and privileges to users based on user activity, using LDAP.
- Used Perforce for Versioning the database objects with each release of the application.
- Deployed web services on secured servers through which the database is accessed via URLs.
- Worked on creating transformations and mappings using Informatica PowerCenter Designer and processing tasks using Workflow Manager to move data from multiple sources into DB2.
- Extensively worked on performance tuning, creating optimal indexes to meet SLAs.
- Used tools such as DB2 Performance Monitor, Snapshot Monitor, Event Monitor, Explain, Visual Explain, and Index Advisor to find bottlenecks. Extensively used db2top to monitor databases in both interactive and batch modes.
- Query tuning for problematic queries sent by application developers or identified during database monitoring.
- Extensively used multi-dimensional clustering (MDC) and materialized query tables (both system- and user-maintained) to enhance system-level performance.
- Planned, scheduled, and created JIL files and uploaded AutoSys jobs based on application requirements.
- Developed tools using shell/AWK scripts to automate various database administration jobs.
- Provided 24x7 on-call support for production database issues raised through high-priority Falcon tickets.
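A hedged sketch of the federated-object setup described above, assuming an Oracle source; the server, schema, user, and table names are placeholders and the password is masked.

```sh
# Hedged sketch: expose an Oracle source through DB2 federation.
# Server, schema, user, and table names are placeholders; the password is masked.
db2 connect to RISKDB
db2 -v "CREATE WRAPPER NET8"
db2 -v "CREATE SERVER ORA_SRC TYPE ORACLE VERSION '11g' WRAPPER NET8 OPTIONS (NODE 'ORAPROD')"
db2 -v "CREATE USER MAPPING FOR USER SERVER ORA_SRC OPTIONS (REMOTE_AUTHID 'rptuser', REMOTE_PASSWORD '********')"
db2 -v "CREATE NICKNAME FED.TRADES FOR ORA_SRC.RPT.TRADES"
db2 -v "SELECT COUNT(*) FROM FED.TRADES"       # sanity check through the nickname
```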
Environment: DB2 UDB 9.1/9.5/9.7, AIX, Linux, Windows, Informatica v8.6.1, BO XI (Business Objects), IBM Data Studio, Apache Tomcat Server, Federation Server, Quality Center, Quest Central, DBArtisan, Perforce.
Confidential - Indianapolis, IN
Sr. DB2 DBA/Oracle DBA/SQL Server DBA - Migration Specialist
Responsibilities:
- Developed and maintained production DB2 UDB databases on AIX and Linux servers for the MO state government’s DOLIR (Department of Labor and Industrial Relations) and SQL Server databases on Windows servers for the LA state government.
- Responsible for software installation, database creation and maintenance, performance monitoring and tuning, backup and recovery, customer consultation, DBMS infrastructure creation and support, and problem determination/resolution.
- Installed DB2 UDB on AIX, Linux, and Windows. Created instances and databases on different servers and provided connectivity to development and production databases.
- Upgraded DB2 UDB instances from 32-bit to 64-bit and installed the latest fix packs.
- Migrated Oracle databases to DB2 databases using IBM MTK.
- Created the stored procedures and functions that could not be migrated by MTK, then compiled and bound them to the databases.
- Developed the architecture for building the database using tools such as Erwin and ER/Studio.
- Designed the database to utilize row compression and enabled row compression on the tables.
- Determined project performance requirements and planned for database backup and recovery, infrastructure, capacity, and database health maintenance.
- Ran the db2ckbkp utility to test the integrity of backup images and confirm they can be restored.
- Automated backups (both online and offline), REORGs, RUNSTATS, space monitoring, rotating db2diag.log, and threshold-based monitoring of table space sizes (see the sketch after this list).
- Worked with IBM WebSphere Application Server to create and add data sources, integrating the databases with the application.
- Worked closely with developers to create database objects (table spaces, schemas, tables, views, indexes, triggers, stored procedures, etc.) as per the requirements.
- Created range-partitioned tables, specifying ranges when creating the tables and indexes.
- Tuned database and database manager configurations for optimum performance. Experimented with changing buffer pool and table space page sizes to improve the time taken by ETL to build data marts.
- Created MDC and MQT tables to improve query performance and reduce the overhead of data maintenance operations.
- Established the environment to store XML data in DB2, set up XQuery to retrieve documents from XML columns, and created XML indexes for efficient retrieval of XML data.
- Set up HADR/HACMP between primary and secondary servers to ensure high availability.
- Worked heavily with the data movement utilities IMPORT, LOAD, and EXPORT and with Quest Central under high-volume conditions to move data from legacy systems.
- Remotely administered the production database 24/7 through a VPN connection, including monitoring database growth, checking for performance bottlenecks, taking backups for disaster recovery, and tuning SQL queries for better performance.
- Created and configured alerts, defining counters and thresholds for them.
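A minimal sketch of an automated online backup with db2ckbkp verification, as referenced above; the database name, paths, and retention window are placeholders.

```sh
#!/bin/ksh
# Hedged sketch: nightly online backup with image verification and retention.
# Database name and paths are placeholders.
DB=DOLIRDB
BKDIR=/db2backup/$DB

db2 backup db $DB online to $BKDIR compress include logs || exit 1
IMAGE=$(ls -t $BKDIR/${DB}.0.* | head -1)                      # newest backup image
db2ckbkp "$IMAGE" > $BKDIR/db2ckbkp_$(date +%Y%m%d).log 2>&1   # confirm the image is restorable
find $BKDIR -name "${DB}.0.*" -mtime +14 -exec rm -f {} \;     # prune images older than 14 days
```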
Environment: DB2 UDB 8.x/9.1/9.5, SQL Server 2005, Oracle 10g, AIX 5.3, Linux, Windows Server 2003, ERWIN, ER/Studio, Quest Central, WAS server 6.0, IBM MTK, SSMA, Service center
Confidential - Moline, IL
DB2 DBA/Infrastructure Analyst
Responsibilities:
- Installed product base code and fix packs (full and alternate) and upgraded instances/databases regularly.
- Installed multiple DB2 copies on the same server to ensure the ability to run independent copies of DB2 products for different functions.
- Performed both in-place and side-by-side migrations of instances and databases from V8.2 to V9.1.
- Implemented LOGARCHMETH1, LOGARCHMETH2, and FAILARCHPATH to enable archival logging, accounting for the implications during recovery situations (see the sketch after this list).
- Actively monitored file system growth, added disk space to volume groups, created new file systems, and allocated file systems to new databases.
- Used db2greg to remove DAS, DAS service, instance, and DB2 service registry entries from the global registry while uninstalling older versions of DB2.
- Installed and upgraded DB2 Client on servers and workstations, deployed DB2 Client to desktops using SMS push, and configured access to ODBC data sources.
- Installed DB2 Connect base code and fix packs, created instances, and cataloged z/OS subsystems on gateway servers to allow mainframe access; used BIG-IP to load-balance the gateways.
- Used various third-party monitoring tools such as Nagios, Quest Central v5.0, BMC tools, and DBArtisan, and configured alerts for defined thresholds.
- Expertise in SQL tuning, DB/DBM parameter tuning, and memory allocation at different levels; experienced with utilities such as EXPORT, IMPORT, LOAD, and autoloader and with all kinds of database snapshots.
- Recovered data to a prior point in time using ROLLFORWARD recovery.
- Set up WebSphere Federation Server to access many relational and non-relational data sources.
- Set up a federated database environment between databases by creating servers, wrappers, nicknames, and user mappings, and activated federated health indicators.
- Well versed in stored-procedure SQL programming in development and test environments using DB2 Development Center/Data Studio Administrator.
- Responsible for performance tuning, 24x7 support, and analysis and resolution of database problems that occur during critical processing times.
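A minimal sketch of the archival-logging setup and a point-in-time rollforward, as referenced above; the database name, paths, and timestamps are placeholders.

```sh
# Hedged sketch: enable archival logging and recover to a point in time.
# Database name, paths, and timestamps are placeholders.
db2 update db cfg for PAYDB using LOGARCHMETH1 "DISK:/db2archive/PAYDB/" \
    LOGARCHMETH2 TSM FAILARCHPATH /db2failarch/PAYDB/
db2 backup db PAYDB to /db2backup/PAYDB          # full backup required after switching from circular logging

# Later, a point-in-time recovery using the archived logs:
db2 restore db PAYDB from /db2backup/PAYDB taken at 20100315030000
db2 "rollforward db PAYDB to 2010-03-15-10.00.00.000000 using local time and stop"
```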
Environment: DB2 UDB 8.x/9.x, AIX 5.3, Linux, Quest Central, BMC Tools, SQL Replication, ER/Studio, DBArtisan, WSFS, DB2 Connect V 8.x/9.1, DB2 Client, BMC Remedy