EDUCATION:
Bachelor of Science in Information Technology
Master of Business Administration
Master's in International Affairs
Master's in Computer Science
PhD in Systems Engineering (pending)
CERTIFICATIONS:
CCNP/CCDP
CISSP
AWS Certified Solutions Architect
Lean
SKILLS:
Cyber Security / Cryptography / Digital Forensics
Bioinformatics / Geospatial (detection / classification)
Protocol Analytics / Statistical Modeling
Predictive Intelligence / Artificial Intelligence / Cognitive Computing
Blockchain / Decentralized Organization / Decentralized Finance
Machine Learning / Deep Learning
Natural Language Processing / Computer Vision
Quantum Computing / High Performance Computing
Governance / Risk / Compliance / Audit / Assurance
Web / Cloud Services / Intrusion & Malware Detection
Hyper-automation / Integration / Virtualization
Data Management / Big Data / Data Fusion
ARTIFACT:
Machine Learning Techniques & Ensemble Methods for Advanced Persistent Malware Threat Detection
Abstract— This research offers machine learning techniques and ensemble methods for detecting advanced persistent malware threats. Our research surveys prevalent datasets, reviews selected collection methods, and implements a malware detection system on the MTA-KDD'19 dataset using a proposed set of algorithms and classifiers. We identify traffic and resource-consumption patterns through data obtained via intrusion detection and antivirus software to isolate common features deemed relevant for each machine learning technique. Specifically, we focus on the behavioral effects of key attributes when malware is fully embedded within infected hosts and on the flow induced by localized events between machines. We acknowledge that data extraction methods, dimensionality of data, feature selection, hyperparameters, and classification practices can systematically affect results, and we therefore offer best practices through our advanced persistent malware study. In particular, we observe accuracy, training speed, and classification performance for supervised learning by use case. Our focus resides upon the following: Logistic Regression (LR), Linear Discriminant Analysis (LDA), Linear Regression (LINR), Naive Bayes (NB), Decision Tree Classifier (DTCART), K-Nearest Neighbors (KNN), Multilayer Perceptron (MLP), and Support Vector Classifier (SVC); ensemble methods - Bagging, Adaptive Boost, Gradient Boosting, Stacking, and Random Forest. Our research is unique in its use of select machine learning algorithms and ensemble methods applied to a recently produced, signature-based malware dataset, thus offering new insights into advanced persistent malware threat mitigation.
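A minimal sketch of the classifier spot-check described above, assuming scikit-learn; the MTA-KDD'19 dataset is not reproduced here, so a synthetic stand-in from make_classification takes its place, and all data parameters are illustrative.

```python
# Sketch of a classifier comparison harness; swap in the real MTA-KDD'19
# feature matrix and labels to reproduce the study's setting.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the malware traffic features (assumption, not real data)
X, y = make_classification(n_samples=2000, n_features=30, n_informative=12, random_state=7)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    "NB": GaussianNB(),
    "DTCART": DecisionTreeClassifier(),
    "KNN": KNeighborsClassifier(),
    "MLP": MLPClassifier(max_iter=500),
    "SVC": SVC(),
}

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=7)
for name, model in models.items():
    # Scaling lives inside the pipeline so each CV fold avoids data leakage
    pipe = make_pipeline(StandardScaler(), model)
    scores = cross_val_score(pipe, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")
```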
ARTIFACT:
Machine Learning Techniques & Ensemble Methods for Intrusion Detection
Abstract— This research offers machine learning techniques and ensemble methods to improve collective outcomes using the NSL-KDD dataset for intrusion detection. We focus on dimensionality of data, feature selection, hyperparameters, contrasts among classifiers, and regression practices to explain differences in scholarly research and to offer best practices for detecting and mitigating anomalous activity. Moreover, we observe the accuracy, training speed, and binary classification performance of supervised and unsupervised algorithms and characterize their performance by use case. Our focus resides upon the following: unsupervised learning - Recurrent Neural Network (RNN) LSTM and Bernoulli Restricted Boltzmann Machine Deep Belief Network (RBDBN); supervised learning - Logistic Regression (LR), Linear Discriminant Analysis (LDA), Linear Regression (LINR), Naive Bayes (NB), Decision Tree Classifier (DTCART), K-Nearest Neighbors (KNN), Multilayer Perceptron (MLP), and Support Vector Classifier (SVC); ensemble methods - Bagging, Adaptive Boost, Gradient Boosting, and Stacking.
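A hedged sketch of the ensemble contrast named above (Bagging, Adaptive Boost, Gradient Boosting, Stacking), again with a synthetic binary task standing in for the preprocessed NSL-KDD data; the base estimators and counts are assumptions.

```python
# Sketch comparing the four ensemble families via 5-fold cross-validation
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              GradientBoostingClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the NSL-KDD binary task (normal vs. anomalous)
X, y = make_classification(n_samples=2000, n_features=40, random_state=1)

ensembles = {
    "Bagging": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50),
    "AdaBoost": AdaBoostClassifier(n_estimators=50),
    "GradientBoosting": GradientBoostingClassifier(),
    "Stacking": StackingClassifier(
        estimators=[("dt", DecisionTreeClassifier()),
                    ("lr", LogisticRegression(max_iter=1000))],
        final_estimator=LogisticRegression(max_iter=1000)),
}
for name, clf in ensembles.items():
    print(name, cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean())
```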
ARTIFACT:
Heterogeneous Processing, Paravirtualization, and Shared Memory Schemes for Cloud Computing
Abstract— Advances in multi-core processing, parallelism, and virtualization have fostered growth in cloud computing, which offers scale efficiency and elastic demand. Unique workload profiles and migrations place new strains on memory, bus, and I/O resources and challenge the limits of traditional computer architecture. This paper surveys integration practices and design constraints. It proposes practical and theoretical alternatives utilizing heterogeneous processing and distributed memory through advanced cache and buffer methods. It considers new cloud benchmarks for memory-centric performance modeling.
ARTIFACT:
Resource Dynamics for Intelligent Interconnected Embedded Systems
Abstract— Advancing computing power, memory technology, and learning algorithms present a unique opportunity to revisit well-researched practices of resource allocation and scheduling in the embedded-system compute domain. Internetwork technologies enable transitive device linking, context-aware device communication, and ubiquitous convergence on a global scale. The domain now encompasses cloud infrastructure, virtualized processor cores, intelligent edge services, and a distributed computing model that derives insights from the data generated by billions of devices [22] [38]. These changes alter static process profiles and raise embedded-system design questions about prevailing architecture and the dynamics of computing functions [1] [40]. Should resource handling mechanisms change? What are the implications for memory and processing? How will event and scheduling priorities interact with memory and I/O schemes as devices become context aware? This paper builds upon processor scheduling techniques and related real-time variants for embedded systems by incorporating requirements for intelligent interconnected devices with exogenous, event-driven coordination. This offering could facilitate smart devices, the Internet of Things, multi-core solutions, and cloud integration [21]. The proposal herein aids intelligent embedded devices by improving mechanisms for context-aware process adaptation through feedback channels and integrates multivariate control factors into the scheduling mechanism to achieve probabilistic adaptive scheduling [17]. The result mitigates process starvation, reduces the need for periodic aggregate process bursts, avoids priority inversion, and allows processes to migrate between profiles as resource demand changes.
Keywords— Smart Scheduling, Parametric, Weighted Adaptive Multivariate (WAM), MFQ, WFQ, SJF, Neural Network, Machine Learning, Embedded System, Xinu, Cloud, Real-time, Distributed Computing, Probabilistic Modeling, Artificial Intelligence.
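An illustrative sketch only, not the paper's mechanism: a toy weighted adaptive multivariate (WAM-style) priority score combining a burst estimate, an aging term, I/O pressure, and a context-awareness signal. The factor names, weights, and capping rule are all invented for illustration.

```python
# Toy multivariate priority scoring in the spirit of weighted adaptive scheduling
from dataclasses import dataclass

@dataclass
class Proc:
    name: str
    burst_est: float        # predicted CPU burst (SJF-style factor)
    wait_time: float        # time spent waiting (starvation guard / aging)
    io_pressure: float      # recent I/O demand, 0..1
    context_urgency: float  # exogenous, context-aware signal, 0..1

def wam_score(p: Proc, w=(0.4, 0.3, 0.1, 0.2)) -> float:
    # Shorter predicted bursts and higher wait/urgency raise priority;
    # in a real system the weights would be tuned or learned per deployment.
    return (w[0] * (1.0 / (1.0 + p.burst_est))
            + w[1] * min(p.wait_time / 100.0, 1.0)  # aging term, capped at 1
            + w[2] * (1.0 - p.io_pressure)
            + w[3] * p.context_urgency)

ready = [Proc("sensor", 2, 40, 0.2, 0.9), Proc("batch", 20, 80, 0.1, 0.1)]
ready.sort(key=wam_score, reverse=True)  # dispatch highest score first
print([p.name for p in ready])
```

Because the score re-evaluates on every scheduling pass, a process whose resource demand shifts simply migrates to a different effective priority rather than being pinned to a static queue.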
ARTIFACT:
Deep Machine Learning Techniques and Detection Methods for Biometric Radiography
Abstract— This research proposes deep learning techniques and computer vision detection methods for biometric radiography. The project explores the use of artificial intelligence through machine learning and deep learning sub-practices to specifically aid the pulmonology field and generally aid other medical specialties. It applies computer vision to biometric imagery to detect anatomy, identify opacities, and classify abnormalities into multiple classes that indicate medical distress. Images include chest X-rays (CXR) and CT scans of varying size and quality. Radiography is an essential imaging method for interventional procedures; radiology, angiography, and tomography imaging modalities serve to display and localize the state of biological health, detect traumatic infarctions, and observe disease pathogenesis. Biometric metadata observations aid model training and prognosis. Focus resides upon supervised deep learning using Convolutional Neural Networks (CNNs) with subsequent performance evaluation and tuning. Specifically, Sequential 2D and EfficientNet-B7 CNNs with transfer learning are built and contrasted. Computer vision applies the Contrast Limited Adaptive Histogram Equalization (CLAHE) method and thresholding via Otsu's method.
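A minimal preprocessing sketch for the CLAHE and Otsu steps named above, using OpenCV; the input path, clip limit, and tile size are placeholder assumptions rather than the study's settings.

```python
# CLAHE contrast enhancement followed by Otsu thresholding on a grayscale CXR
import cv2

img = cv2.imread("cxr.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
assert img is not None, "supply a real chest X-ray image path"

# Contrast Limited Adaptive Histogram Equalization boosts local contrast
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
equalized = clahe.apply(img)

# Otsu's method picks the global threshold that minimizes intra-class variance
thresh_val, mask = cv2.threshold(equalized, 0, 255,
                                 cv2.THRESH_BINARY + cv2.THRESH_OTSU)
print(f"Otsu threshold: {thresh_val}")
cv2.imwrite("cxr_clahe_otsu.png", mask)
```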
ARTIFACT:
Computational Pipelining - Survey of Discrete Resource Allocation and Parallel Execution
Abstract— Recent advances in computer architecture that enhance throughput and reduce latency in distributed workloads have brought scale efficiencies to otherwise idle computing resources. This paper surveys resource allocation practices from the vantage of pipelining and parallel instruction execution, along with their technical implications. It considers workload breadth and depth and suggests flow optimization techniques for programming and system infrastructure planning through unit/component orchestration.
ARTIFACT:
Blockchain Security & Machine Learning for Drone Swarm Control
Abstract— The proliferation of sensor data from the Internet of Things (IoT) has yielded new challenges and opportunities within the mobile computing realm stemming from scalability, reliability, security, and privacy requirements. Though bandwidth, computing power, and data storage have become more ubiquitous, resource demand, data flow, and security challenges have yielded a consistent need for design adaptation with each novel use case, including centralized, decentralized, federated, and edge designs. These issues are exacerbated within interactive mobile IoT systems, particularly the Internet of Vehicles (IoV) and Internet of Drones (IoD). Advances in machine learning and artificial intelligence now enable devices to interact and respond to environmental prompts in dynamic real-time environments. This is called emergent behavior, and such capability implies devices may act both individually and as a group. This creates new challenges for sensor data management and security in current models. Data ingestion, validation, and transaction control can all burden compute resources in new Mobile Edge Computing (MEC) solutions and create Quality of Service (QoS) problems. Lack of data confidentiality, integrity, and authenticity can yield attack vectors that compromise service assurance and therefore mission objectives or public safety. Prominent solutions include custom mobile architectures, advanced machine learning, and blockchain technologies; however, few integrate all three into a framework for IoD systems. In this research we consider the dynamics of mobile edge computing for IoD with permissioned distributed ledger technology using a Proof of Authority (PoA) consensus model. We explore various distance functions and machine learning classifiers, including K-Nearest Neighbors (KNN), Radius Neighbors (RN), and Nearest Centroid (NC). We produce two novel datasets simulating drone swarm conflict among opposing groups and deploy static sensor stations. Our detection algorithms register airspace infractions, and our flocking algorithm simulates evade and swarm functionality using path and goal objectives that generate datasets dynamically. Finally, to enhance model performance, we center and scale the data independently on each feature by computing the relevant sample statistics, and we apply principal component analysis to reduce the dimensionality of the sensor data.
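A hedged sketch of the feature pipeline described in the closing sentences: per-feature standardization, PCA, and the three neighbor-based classifiers. Synthetic points stand in for the drone-swarm datasets, and the radius and component counts are assumptions.

```python
# Standardize per feature, reduce dimensionality, then contrast neighbor classifiers
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import (KNeighborsClassifier, NearestCentroid,
                               RadiusNeighborsClassifier)
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the simulated swarm sensor features
X, y = make_classification(n_samples=1500, n_features=12, n_informative=6, random_state=3)

classifiers = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "RadiusNeighbors": RadiusNeighborsClassifier(radius=5.0, outlier_label="most_frequent"),
    "NearestCentroid": NearestCentroid(),
}
for name, clf in classifiers.items():
    # StandardScaler centers/scales each feature; PCA compresses the sensor space
    pipe = make_pipeline(StandardScaler(), PCA(n_components=6), clf)
    print(name, cross_val_score(pipe, X, y, cv=5).mean())
```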
ARTIFACT:
Cyber Forensics with Deep Learning Recurrent Neural Networks
The Zero Trust cybersecurity framework has altered axioms traditionally applied to qualify and quantify risk within service, platform, and software-as-a-service models. This response is partly attributed to the diffusion of access, authorization, and third-party dependencies now integrated into digital taxonomies, but also to geographic distribution, managed services, outsourcing, the Internet of Things, and edge computing. Threats can be malicious or innocuous and can arise inside or outside a trusted domain; thus a default-deny policy is applied. This added level of security has been shown to prevent data breaches, affording greater regard for anomaly detection.
This research focuses on anomalous patterns observed during real events within the cloud environment and employs the BPF-Extended Tracking Honeypot (BETH) cybersecurity dataset. The dataset was developed by its authors for uncertainty and robustness benchmarking in a skewed, time-series format and was produced from real (non-simulated) cloud computing observations. This experiment leverages a multi-layer recurrent neural network (RNN) for deep learning within a pipeline that examines flows, dynamic OS signatures, and user/process activity that is hashed/fingerprinted, condensed, and prioritized for triage.
The goal is to reduce dimensionality in the data, preserve the time series, and ascertain model viability for detecting scenarios absent from training. This research offers results for Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) algorithms. It expands a previously unpacked dataset attribute and stages a unique approach via Abstract-Feature Analysis (AFA), hyperparameter tuning, and Principal Component Analysis (PCA).
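A minimal Keras sketch contrasting the LSTM and GRU models described above; the window length, feature count, and random stand-in tensors are assumptions, not the BETH preprocessing.

```python
# Stacked recurrent models for binary anomaly scoring over event sequences
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

timesteps, features = 50, 8  # assumed window length and post-PCA feature count
X = np.random.rand(1024, timesteps, features).astype("float32")  # stand-in sequences
y = np.random.randint(0, 2, size=(1024, 1))                      # benign/suspicious labels

def build(cell):
    model = keras.Sequential([
        keras.Input(shape=(timesteps, features)),
        cell(64, return_sequences=True),        # first recurrent layer keeps the sequence
        cell(32),                               # second layer condenses it to a vector
        layers.Dense(1, activation="sigmoid"),  # binary anomaly score
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

for name, cell in [("LSTM", layers.LSTM), ("GRU", layers.GRU)]:
    model = build(cell)
    model.fit(X, y, epochs=2, batch_size=64, verbose=0)
    print(name, model.evaluate(X, y, verbose=0))
```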
ARTIFACT:
Assessing Functional and System Architecture Generative Processes – A Modeling Framework for Healthcare Compatibility and Matching
The recent global pandemic exposed fissures in our healthcare system, including adaptability, logistics, supply chain dependencies, modeling deficiencies, and architecture flaws. Had the event been an advanced persistent bio-terror scenario, our system would have been inadequately prepared. A top-down approach to pandemic response helps to improve reaction time and velocity, but it can be acutely myopic to change in key performance drivers while lacking adaptability in decision support systems. Part of the solution is a critical assessment of functional architecture alternatives and their generative processes. Another is a means of incorporating and controlling machine learning in a data-driven environment, and the last is a bottom-up feedback loop to ensure outliers, anomalies, and deviations are associated probabilistically. Risk-weighted matching and compatibility is one method that can be adopted to aid symptom, treatment, and immunization response through fuzzy associative memory and machine learning.
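An illustrative sketch of the fuzzy associative memory recall step mentioned above, using the standard max-min composition; the symptom/treatment labels and rule matrix are invented for illustration only, not drawn from the paper.

```python
# Max-min fuzzy associative memory: fuzzified symptoms -> compatibility scores
import numpy as np

symptoms = ["fever", "cough", "hypoxia"]
treatments = ["antiviral", "oxygen", "monitor"]

# Hypothetical rule matrix W[i, j]: degree to which symptom i implies treatment j
W = np.array([
    [0.8, 0.2, 0.5],   # fever
    [0.6, 0.4, 0.7],   # cough
    [0.3, 0.9, 0.2],   # hypoxia
])

def fam_recall(membership: np.ndarray) -> np.ndarray:
    # Max-min composition: output_j = max_i min(membership_i, W[i, j])
    return np.max(np.minimum(membership[:, None], W), axis=0)

patient = np.array([0.7, 0.3, 0.9])  # fuzzified observations per symptom
for t, s in zip(treatments, fam_recall(patient)):
    print(f"{t}: compatibility {s:.2f}")
```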
Bandwidth partitioning for shared distributed networks.
Overcoming the NUMA effect within cloud computing.
Natural language processing for decision support systems.
Advanced cryptography methods for GPUs and SOC designs.
Novel methods for upgrading and securing SCADA systems.
Expansion of biometric modalities.
Data micro-sharding and fusion methods.
Time series correlation and heteroscedasticity for event detection.
Mitigating cyber threats to aid national and corporate security.
Information assurance, digital forensics.
Machine learning / artificial intelligence (statistical theory and evolutionary methods).
Manipulation of large time series and categorical data streams.
Distributed / decentralized computer networks (anomaly detection, optimization).
Governance, audit, risk, compliance, ethics.
Field-realizable defensive capabilities and offensive cyber-based systems, including:
Intrusion detection with offensive threat neutralization.
Generative adversarial machine learning networks.
Traceability methods for deep learning.
Data fusion techniques for disparate image sources.
Masking, obfuscation, encryption techniques for dynamic data stores.
Coordination methods for distributed blockchain ledgers.
Advanced tunneling protocols.
Drone swarm control and detection.
SNIFFING & SPOOFING
FIREWALL MANIPULATION
TRANSPORT LAYER ATTACKS
DNS ATTACKS
CRYPTOGRAPHY
MIDDLEWARE ATTACKS
TUNNELING
CROSS-SITE SCRIPTING
INTRUSION DETECTION
VIRTUAL PRIVATE NETWORKING
ENVIRONMENT MANIPULATION
SHELL ATTACKS
RACE CONDITION
DIRTY COW
FORMAT STRING VULNERABILITY
SHELLSHOCK ATTACK
CROSS-SITE REQUEST FORGERY
SQL INJECTION
ANDROID REPACKAGING ATTACK
ANDROID DEVICE ROOTING
Performed comparative analysis with a test harness; tuned hyperparameters.
Performed multi-dataset assimilation, preprocessing, and fusion.
Presented favored solutions from research via papers and presentations for policy consideration.
Used Python, scikit-learn, Keras, and TensorFlow for modeling.
Tuned models with grid search and cross-validation where applicable (see the sketch after this list).
Employed computer vision for object detection.
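A minimal sketch of the grid-search-with-cross-validation tuning noted in the list above, assuming scikit-learn; the SVC parameter grid and synthetic data are illustrative.

```python
# Exhaustive hyperparameter search with 5-fold cross-validation
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.001], "kernel": ["rbf"]}
search = GridSearchCV(SVC(), param_grid, cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```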
Supervised/Unsupervised Algorithms:
Decision Tree
Random Forest
Linear Regression
Logistic Regression
Linear Discriminant Analysis
Naive Bayes
K-nearest Neighbor
Multilayer Perceptron
Support Vector Machines
Recurrent Neural Network and Convolutional Neural Network
Bernoulli Restricted Boltzmann Machines (see the sketch after this list)
Ensemble methods included Bagging, Boosting, Stacking
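A hedged sketch of the Bernoulli RBM listed above, in its common scikit-learn role: unsupervised feature extraction feeding a logistic regression. The digits dataset is a stand-in, and the hyperparameters are assumptions.

```python
# RBM feature extraction piped into a logistic-regression classifier
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
X = X / 16.0  # digits pixels range 0..16; the RBM expects values in [0, 1]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = Pipeline([
    ("rbm", BernoulliRBM(n_components=64, learning_rate=0.06, n_iter=10, random_state=0)),
    ("lr", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print("test accuracy:", round(model.score(X_test, y_test), 3))
```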
GEOSPATIAL TECHNOLOGIES:
Solutions that reduce the amount of time spent searching imagery.
Solutions that support the ability to capture objects contained within imagery.
Solutions that can identify and label objects in imagery while suggesting specifications and visual references.
Solutions that support the identification of types of objects in an image using automated AI-assisted human-in-the-loop reinforcement learning processes.