
Database Architect/Administrator/Developer Resume

Gaithersburg, MD

SUMMARY

  • A Database Administrator/Architect/Developer with strong verbal and written communication skills and solid analytical skills for administering, managing, and coding relational database and cloud solutions.
  • A self-starter who is able to work independently or lead a team toward a collaborative solution.

TECHNICAL SKILLS

  • Spark on Scala
  • Kafka
  • Flume
  • Sqoop
  • NoSQL DB MongoDB
  • Cassandra
  • RedShift
  • Couchbase
  • PostgreSQL
  • RDF
  • SPARQL
  • jQuery Data Tables
  • Tableau
  • OpenRefine
  • RapidMiner, Solver
  • Eclipse
  • Ant
  • JUnit
  • Apache Tomcat

PROFESSIONAL EXPERIENCE

Confidential, Gaithersburg, MD

Database Architect/Administrator/Developer

Responsibilities:

  • Advanced experience with Spark, PySpark, and the Hadoop framework
  • Implemented appropriate MongoDB indexes (B-Tree, Geospatial, Text) for performance improvement and developed MongoDB and API prototypes and proofs of concept (see the index-creation sketch after this list).
  • Experience with physical collection creation, access pattern tuning, sharding implementation, index creation, and debugging query execution to obtain optimum database performance
  • Excellent understanding of various MongoDB database design patterns and physical architectures for different use cases
  • Performed data replication and sharding in distributed databases
  • Wrote shell scripts for monitoring slow queries, replication lag, node failures, disk usage, etc. (see the monitoring sketch after this list)
  • Performed database health checks (complete review of slow queries, fragmentation, index usage, etc.) and upgrades (Java version, MongoDB version, etc.)
  • Maintained and performed log rotation/maintenance (mongos, mongod, config servers, etc.)
  • Performed segregation of duties (user management, designing user roles and responsibilities) and set up alerting for disk usage, CPU, and memory checks
  • Designed DR (Disaster Recovery) / COB (Continuity of Business) plans as applicable
  • Monitored database profiling, locks, memory usage, number of connections, and page faults
  • Performed export and import of data to and from MongoDB
  • Monitored various issues related to the database, database migrations, and updates.
  • Monitored at the server, database, and collection levels with various MongoDB monitoring tools, and performed database hardening.
  • Installed and configured database software in accordance with client-defined standards
  • Implemented Mongo Management Service for automating a variety of tasks, including backup/recovery and performance management using MMS and Mongo Profiler
  • Experienced with data modeling using Erwin, ER/Studio, Oracle Designer, and Lucidchart
  • Experienced with document, multivalue, graph, and entity-relationship data models
  • Experienced with NoSQL databases (MongoDB, Cassandra, Couchbase) as well as Redshift and PostgreSQL
  • Experienced with implementing appropriate NoSQL indexes (B-Tree, Geospatial, Text) for performance improvement.
  • Designed highly parallelized data ingestion and transformation jobs in Spark, including Spark Streaming (see the streaming sketch after this list).
  • Experienced designing and implementing data ingestion and transformation for big data platforms
  • Designed and implemented scalable Big Data architecture solutions
  • Provided guidance and platform selection advice for common Big Data (Distributed) platforms
  • Designed data flow and processing pipelines for ingestion and analysis using modern tool sets such as Spark on Scala, Kafka, Flume, Sqoop, and others.
  • Strong hands-on experience with Cassandra, CQL, data modeling, data replication, clustering, and indexing for handling large data sets; monitored databases and set up production databases with clustering (see the CQL sketch after this list)
  • Developed Python, Spark, and HQL scripts to filter/aggregate data, and used Sqoop to transfer data to and from Hadoop
  • Expertise in managing a Cassandra cluster as Cassandra admin using DataStax Enterprise Edition
  • Experienced with data analytics, including relational databases, data warehousing, big data (Hadoop, Spark), business intelligence, NoSQL, and Azure Machine Learning Workbench.
  • Proficiency with HDFS, MapReduce, Hive, HBase, Pig, Mahout, Avro, Oozie.
  • Very strong in Core Java, Spring MVC, REST/SOAP Web services, HTML5, CSS, JavaScript
  • Developed and maintained distributed application architectures using MapR, MongoDB, and Cassandra.
  • Performed Cassandra backup, recovery, database parameter configuration, prototyping, and design.
  • Defined Cassandra data repository requirements, data dictionaries and warehousing requirements.
  • Experience with Java 8 and RESTful API implementation using Tomcat, with JSON and XML data formats/models
  • Collaborated with cross-functional teams to utilize new big data tools
  • Working knowledge of Hadoop, Hive, Spark, Spark SQL, and Python.
  • Proficiency with the Hadoop ecosystem, particularly Spark and Drill.
  • Expertise in Hive Query Language (HQL) and Hive internals (internal/external tables, partitions, clustering, metastore, etc.)
  • Experienced with creating Spark/Python UDFs to enhance HQL (see the UDF sketch after this list)
  • Advanced knowledge of Google Cloud Platform, Azure, Oracle Cloud Platform, IBM Cloud, and AWS.
  • Experienced with Oracle, SAP HANA Express, SQL Server, MySQL, and Sybase databases.
  • Utilized IBM DB2, Oracle 10g/11g/12c, SQL Server 2017, Netezza, and Teradata data warehouses.
  • Proficiency with Microsoft BI and Tableau, and experience with business intelligence platforms, data visualization research and design, and analytical best practices.
  • Experienced with ETL code deployments (repository, folder, workflow, and mapping level).
  • Experienced with various operating platforms and metadata management, with extensive experience in SQL and ETL
  • Designed, implemented, managed, and orchestrated Docker container clusters using Kubernetes
  • Expert knowledge of SQL, including the ability to create and populate databases, perform data normalization, analyze queries, index tables, and perform all types of table joins.
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Provided database coding to support business applications using T-SQL.
  • Expert knowledge of Tableau (Public and Desktop), MicroStrategy Desktop, QlikView, Microsoft BI, and SAP; experience with statistical analysis methods
  • Excellent visualization experience utilizing MicroStrategy Desktop, Tableau, and QlikView
  • Strong background and experience using QlikView to deliver data-rich projects for clients.
  • Experienced with big data manipulation and analysis using various frameworks/tools: SQL, R
  • Strong practical knowledge of analytical techniques and methodologies such as machine learning (supervised and unsupervised techniques), segmentation, mix and time series modeling, response modeling, lift modeling, experimental design, neural networks, data mining, Bayesian inference, and optimization techniques
  • Created a PL/SQL module used to integrate existing data from third parties into the database.
  • Developed and maintained advanced SQL and stored procedures to generate data reports with PostgreSQL.
  • Performed SQL query performance tuning by creating/modifying indexes, setting transaction isolation levels, and changing query structures to reduce table scans and seeks.
  • Strong experience in writing SQL queries, creating stored procedures, triggers, functions
  • Ability to create and implement data engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, documentation, build processes, automated testing, and operations.
  • Architected, designed and implemented high performance large volume data integration processes, database, storage, and other back-end services in fully virtualized environments
  • Developed and recommended innovative, yet proven and demonstrated, approaches to solving the client's business and technical problems using analytics solutions
  • Designed data structures for ingestion and reporting, specific to use case and technology.
  • Provided data management expertise in evaluating requirements and developing data architecture and refining platform components and design
  • Mentored and guided junior data engineers, shared best practices and performed code reviews
  • Hands-on SQL Server administration, replication methods, and Always On Availability Groups (AOAG)
  • Hands-on SQL Server backups, restoration, repair, and performance improvement (i.e., updating statistics)
  • Responsible for providing leadership in data management strategies, governance and design.
  • Architected technical data plans aligned with the client's mission, strategy, goals, and objectives.
  • Proficiency with AWS, IBM SoftLayer, Microsoft Azure, Google Cloud, and Oracle Cloud.
  • Working knowledge of Oracle VPD, Data Vault, TDE, Audit Vault, and database security to meet FISMA standards.
  • Installed and configured Oracle RAC/ASM, Oracle Streams, Data Guard and Data Guard Broker.
  • Developed suitable AWS-based and Hybrid solutions based on customer requirements.
  • Generated AWS migration roadmaps and led buy-in across complex organizational structures.
  • Used Amazon Elastic Compute Cloud, Amazon Simple Storage Service, Amazon SimpleDB/RDS databases, and AWS Identity and Access Management (IAM).
  • Cleansed, manipulated, and analyzed large datasets using the Hadoop platform.
  • Managed and implemented data processes including Data Quality scripts
  • Performed R&D and exploratory analysis using statistical techniques and machine learning clustering methods to understand data.
  • Developed data profiling, deduplication, and matching logic for analysis
  • Experience as an engineer doing development with Java/JEE, JavaScript, and other UI technologies.
  • Strong experience in application design, design patterns and performance tuning.
  • Strong experience in HTML5, JavaScript, Backbone.js, AngularJS, and React
  • Familiarity with concepts related to the Semantic Web, linked open data, and RDF.
  • Integrated data extraction techniques from various tools to collect and transform relevant data.
  • Experience with deep learning frameworks such as Deeplearning4j and TensorFlow.
  • Knowledge of stream processing and Couchbase with Kafka, Cassandra, Redis, Python, and Ruby.
  • Expertise using Scrum and XP in the software development cycle, in addition to lean methods and techniques such as Kanban and value stream mapping.
  • Knowledge of HBase, MapR-DB, MapR-FS, performance tuning, and cluster size estimation.
  • Experience working in an AWS Cloud environment (S3, PostgreSQL, Redshift, DMS)
  • Provided support on Hortonworks DataFlow/NiFi ("HDF") related issues encountered by customer teams.
  • Hands-on experience with probabilistic graphical models using Bayesian networks and Markov networks.
  • Developed and maintained enterprise solutions using R for marketing and risk analytics.
  • Architected hybrid AWS and on-premise solutions for technology clusters and patterns.
  • Commanding knowledge of and experience writing advanced SQL queries and PL/SQL packages.
  • Achieved clients’ scalability and performance using Oracle Enterprise embedded R execution.
  • Designed and implemented Big Data analytic solutions on a Hadoop platform.
  • Used Bayesian network probabilistic graphical models to model cause-and-effect relationships.
  • Demonstrated ability to achieve stretch goals in a very innovative and fast paced environment.
  • Demonstrated ability to learn new technologies quickly and independently; good understanding of the ITIL operational framework; working knowledge of data quality and data profiling.
  • Accessed data from a variety of sources, including NoSQL databases and APIs.
  • Performed data aggregations, joins, manipulations, and cleaning to ensure data integrity throughout the entire analytical process.
  • Applied supervised and unsupervised modeling techniques to data to generate predictions and uncover patterns.
  • Developed hypothesis statements and applied statistical testing to determine causality and generalize observations.
  • Solved problems by applying a variety of predictive modeling/machine learning/deep learning techniques.
  • Maintained a comprehensive understanding of the current analytical landscape, including emergent technologies and methods.
  • Worked closely with the Data Engineering and Product teams to ensure the success of all projects.
  • Decomposed problem statements into specific deliverables and requirements.
  • Supported the Product Owners in developing sprint goals and roadmaps.
  • Continuously scanned the data science landscape for recent developments and opportunities to integrate new methodologies into the existing project portfolio.
  • Created presentations and delivered results to colleagues, stakeholders, and executive leadership.
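
The bullets above reference MongoDB index work (B-Tree, Geospatial, Text). A minimal illustrative sketch in Python with pymongo follows, assuming a hypothetical `places` collection with `created_at`, `location`, and `description` fields; the actual collections and field names on the engagement would differ.

```python
from pymongo import MongoClient, ASCENDING, GEOSPHERE, TEXT

# Hypothetical connection string and collection; adjust to the actual deployment.
client = MongoClient("mongodb://localhost:27017")
places = client["demo_db"]["places"]

# B-Tree index on a frequently filtered field.
places.create_index([("created_at", ASCENDING)], name="created_at_btree")

# Geospatial (2dsphere) index for location-based queries.
places.create_index([("location", GEOSPHERE)], name="location_2dsphere")

# Text index to support keyword search on the description field.
places.create_index([("description", TEXT)], name="description_text")

# Inspect existing indexes to confirm they were built.
for idx in places.list_indexes():
    print(idx["name"], idx["key"])
```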
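
The monitoring bullet above mentions shell scripts for replication lag and slow queries; a rough Python equivalent using pymongo is sketched here for illustration only, with placeholder connection strings and thresholds, and assuming the profiler is enabled on the target database.

```python
from pymongo import MongoClient

# Hypothetical replica-set URI; the slow-query threshold is illustrative.
client = MongoClient("mongodb://localhost:27017/?replicaSet=rs0")
SLOW_MS = 100

# Replication lag: compare each secondary's optime to the primary's.
status = client.admin.command("replSetGetStatus")
primary_optime = next(m["optimeDate"] for m in status["members"]
                      if m["stateStr"] == "PRIMARY")
for m in status["members"]:
    if m["stateStr"] == "SECONDARY":
        lag = (primary_optime - m["optimeDate"]).total_seconds()
        print(f"{m['name']} replication lag: {lag:.0f}s")

# Slow queries: read the profiler collection (profiling must be enabled).
profile = client["demo_db"]["system.profile"]
for op in profile.find({"millis": {"$gt": SLOW_MS}}).sort("millis", -1).limit(5):
    print(op.get("op"), op.get("ns"), op.get("millis"), "ms")
```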
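
A sketch of a parallelized ingestion job with Spark Structured Streaming reading from Kafka, as referenced above. The broker, topic, and event schema are hypothetical, the job assumes the spark-sql-kafka connector is on the classpath, and a console sink stands in for the real downstream store.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Hypothetical event schema; the real pipeline's fields will differ.
schema = StructType([
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a Kafka topic as a streaming DataFrame (broker/topic names are placeholders).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")
       .option("subscribe", "sensor-events")
       .load())

# Parse the JSON payload and aggregate per device over 5-minute windows.
events = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")
agg = (events
       .withWatermark("event_time", "10 minutes")
       .groupBy(window(col("event_time"), "5 minutes"), col("device_id"))
       .avg("reading"))

# Write the rolling aggregates out; console sink keeps the sketch self-contained.
query = agg.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```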
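
A minimal sketch of query-driven Cassandra data modeling with the DataStax Python driver, illustrating the partitioning and clustering choices mentioned above; the keyspace, table, and replication settings are placeholders.

```python
from cassandra.cluster import Cluster

# Hypothetical contact points; replication settings are illustrative.
cluster = Cluster(["127.0.0.1"])
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS metrics
    WITH replication = {'class': 'NetworkTopologyStrategy', 'dc1': 3}
""")

# Query-driven model: partition by device and day, cluster by event time,
# so "latest readings for a device" is a single-partition read.
session.execute("""
    CREATE TABLE IF NOT EXISTS metrics.readings_by_device (
        device_id text,
        day date,
        event_time timestamp,
        reading double,
        PRIMARY KEY ((device_id, day), event_time)
    ) WITH CLUSTERING ORDER BY (event_time DESC)
""")

# Prepared statements keep high-volume inserts efficient.
insert = session.prepare(
    "INSERT INTO metrics.readings_by_device (device_id, day, event_time, reading) "
    "VALUES (?, ?, ?, ?)")
```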
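
A short sketch of registering a Spark/Python UDF so it can be called from HQL-style queries, as referenced above; the `normalize_region` helper and the `sales_raw` table are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StringType

# Hive support lets the registered UDF be used in SQL over Hive tables.
spark = (SparkSession.builder
         .appName("hql-udf-sketch")
         .enableHiveSupport()
         .getOrCreate())

# Hypothetical helper: normalize free-text region codes before aggregation.
def normalize_region(value):
    return (value or "").strip().upper() or "UNKNOWN"

spark.udf.register("normalize_region", normalize_region, StringType())

# The registered UDF can then be called directly from HQL-style queries.
spark.sql("""
    SELECT normalize_region(region) AS region, COUNT(*) AS cnt
    FROM sales_raw
    GROUP BY normalize_region(region)
""").show()
```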

Confidential, Norcross, GA

Technical Consultant

Responsibilities:

  • Demonstrated experience with big data structures and tools such as triplestores, RDF, SPARQL, jQuery DataTables, Tableau, OpenRefine, RapidMiner/Solver, Eclipse, Ant, JUnit, and Apache Tomcat.
  • Developed, designed, tuned, and maintained SSIS packages to perform the ETL process.
  • Designed and developed SQL Server and PostgreSQL stored procedures, functions, views, and triggers to be used during the ETL process.
  • Performed data profiling and source to target mappings (while capturing ETL and business metadata) for populating dimensional models.
  • Wrote scripts for automated testing of data in the target facts and dimensions (see the reconciliation sketch after this list).
  • Captured audit information during all phases of the ETL process.
  • Wrote and maintained documentation of the ETL processes via process flow diagrams.
  • Conducted appropriate functional and performance testing to identify bottlenecks and data quality issues.
  • Delivered a forward-leaning architectural blueprint that includes foundational infrastructure elements (i.e., network, storage platform, middleware).
  • Demonstrated experience with data architecture, data modeling, database design, and data systems implementation, especially with Oracle-based technologies as well as MySQL and Microsoft SQL Server.
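
A sketch of the kind of automated data checks referenced above for target facts and dimensions, written in Python with pyodbc; the DSN, schemas, and table names are placeholders, not the project's actual objects.

```python
import pyodbc

# Hypothetical DSN and table names; the real checks ran against the project's warehouse.
conn = pyodbc.connect("DSN=warehouse;Trusted_Connection=yes")
cur = conn.cursor()

checks = {
    # Source-to-target row count reconciliation for a fact load.
    "fact_sales_rowcount": (
        "SELECT (SELECT COUNT(*) FROM staging.sales) - (SELECT COUNT(*) FROM dw.fact_sales)"),
    # Referential integrity: fact rows whose customer key is missing from the dimension.
    "fact_sales_orphans": (
        "SELECT COUNT(*) FROM dw.fact_sales f "
        "LEFT JOIN dw.dim_customer d ON f.customer_key = d.customer_key "
        "WHERE d.customer_key IS NULL"),
}

failures = []
for name, sql in checks.items():
    value = cur.execute(sql).fetchone()[0]
    if value != 0:
        failures.append(f"{name}: expected 0, got {value}")

print("ETL checks passed" if not failures else "\n".join(failures))
```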

Confidential

Technical Instructor

Responsibilities:

  • Taught Oracle 8i/9i/10g Development/Administration, Sybase, DB2, Crystal Reports 8.5/9.0, SQL Server 2000/2005 and Windows 2000/XP.
  • Taught Microsoft Active Directory 2000/2003, A+ Hardware/Software, Network+, Server+, Security+, Microsoft Project, IT Project +, Microsoft Office Access XP and C#.
