INFORMS Nashville – 2016


2 - Degradation Prediction Of Printed Images

Ziyi Wang, Rutgers University, Piscataway, NJ, 08854, United States, ziyiwangcumtb@gmail.com, Elsayed A. Elsayed

Today, a great number of images are produced by digital color printers, especially inkjet printers. Many factors lead to the degradation of such images, and accurate prediction models of this degradation are of interest. Previous research on image degradation usually measures the density loss or color change of the prints. In this presentation, the area coverage of the Neugebauer primaries for the basic four-color (CMYK) ink set is estimated from the spectral information of the print. A degradation model is developed to predict the area coverage loss over time. A numerical example is used to illustrate the proposed approach.
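
For readers unfamiliar with the area-coverage step, the classical (non-cellular) Neugebauer model treats a measured reflectance spectrum as a convex combination of the primaries' spectra, so coverages can be recovered by constrained least squares. The sketch below is a minimal illustration with synthetic spectra, not the authors' estimation procedure:

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative sketch with synthetic spectra (not the authors' model):
# recover Neugebauer area coverages from a measured reflectance
# spectrum by nonnegative least squares. For a four-ink CMYK set
# there are 2^4 = 16 Neugebauer primaries.
rng = np.random.default_rng(0)
n_wavelengths, n_primaries = 31, 16           # e.g., 400-700 nm in 10 nm steps
R_primaries = rng.uniform(0.05, 0.95, size=(n_wavelengths, n_primaries))

true_a = rng.dirichlet(np.ones(n_primaries))  # true coverages sum to one
R_measured = R_primaries @ true_a + rng.normal(0.0, 0.005, n_wavelengths)

# Enforce the sum-to-one constraint softly via a heavily weighted row.
w = 100.0
A = np.vstack([R_primaries, w * np.ones(n_primaries)])
b = np.append(R_measured, w)
a_hat, _ = nnls(A, b)

print("estimated coverages:", np.round(a_hat, 3))
print("max abs error:", float(np.abs(a_hat - true_a).max()))
```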

3 - Modeling Spatio-temporal Degradation Data

Xiao Liu, IBM T.J. Watson Research Center, liuxiaodnn_1@hotmail.com

This talk presents a modeling approach for an important type of degradation data: degradation data collected over time and across a spatial domain. The connection between the proposed model and traditional, purely time-dependent univariate stochastic degradation models is discussed, and an application example is provided.
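
One generic way to connect such data to the pure time-dependent models mentioned above (an illustration only, not necessarily the model in the talk) is to let classical Wiener-process degradation paths share spatially correlated drift rates:

```python
import numpy as np

# Illustrative sketch only: couple classical time-dependent Wiener
# degradation paths with spatial correlation by drawing the drift
# rates at the measurement sites from a Gaussian process with an
# exponential covariance over the spatial domain.
rng = np.random.default_rng(1)
sites = rng.uniform(0, 10, size=(5, 2))          # 5 sensor locations
dist = np.linalg.norm(sites[:, None] - sites[None, :], axis=-1)
cov = 0.04 * np.exp(-dist / 3.0)                 # spatial covariance of drifts
drift = rng.multivariate_normal(0.5 * np.ones(5), cov)

t = np.linspace(0, 10, 101)
dt = t[1] - t[0]
sigma = 0.1                                      # volatility of each path
increments = (drift[:, None] * dt
              + sigma * np.sqrt(dt) * rng.standard_normal((5, t.size - 1)))
paths = np.concatenate([np.zeros((5, 1)), np.cumsum(increments, axis=1)], axis=1)

print("degradation at t=10 for each site:", np.round(paths[:, -1], 2))
```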

MB68

Mockingbird 4 - Omni

Joint Session QSR/DM: Data analytics for system improvement I

Sponsored: Quality, Statistics and Reliability/Data Mining

Sponsored Session

Chair: Kaibo Liu, University of Wisconsin-Madison, Madison, WI, kliu8@wisc.edu

Co-chair: Haitao Liao, University of Arkansas, Fayetteville, AR, liao@uark.edu

1 - Kernel Fisher Discriminant Analysis For Uncertain Data Objects

Behnam Tavakkol, Rutgers University, Piscataway, NJ, btavakkol66@gmail.com, Myong K Jeong, Susan Albin

In uncertain data problems, features are represented by multiple observations or by their fitted PDFs. We propose measures of scatter for uncertain data objects, including a covariance matrix along with within-class and between-class scatter matrices. We also propose a Fisher linear discriminant and a kernel Fisher discriminant for classifying uncertain data objects.
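
For context, the certain-data baseline that the talk extends, classical two-class kernel Fisher discriminant analysis, can be sketched as follows; the proposed uncertain-data scatter measures are not reproduced here:

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    """RBF kernel matrix between the rows of X and the rows of Y."""
    d2 = ((X[:, None] - Y[None, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Sketch of classical two-class kernel Fisher discriminant analysis
# on synthetic data (the certain-data baseline, not the talk's
# uncertain-data extension).
rng = np.random.default_rng(2)
X1 = rng.normal(0.0, 1.0, (30, 2))      # class 1
X2 = rng.normal(2.5, 1.0, (30, 2))      # class 2
X = np.vstack([X1, X2])
n1, n2, n = len(X1), len(X2), len(X)

K = rbf(X, X)
m1 = K[:, :n1].mean(axis=1)             # kernelized class means
m2 = K[:, n1:].mean(axis=1)

# Within-class scatter in feature space, with ridge regularization.
K1, K2 = K[:, :n1], K[:, n1:]
N = (K1 @ (np.eye(n1) - np.full((n1, n1), 1 / n1)) @ K1.T
     + K2 @ (np.eye(n2) - np.full((n2, n2), 1 / n2)) @ K2.T
     + 1e-3 * np.eye(n))
alpha = np.linalg.solve(N, m1 - m2)     # discriminant coefficients

# Project both classes onto the discriminant direction.
proj = K @ alpha
print("class means on discriminant:", proj[:n1].mean(), proj[n1:].mean())
```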

2 - An Efficient Statistical Quality Control Scheme For High-dimensional Processes

Sangahn Kim, Rutgers University, Piscataway, NJ, sk1389@scarletmail.rutgers.edu, Myong K Jeong, Elsayed A. Elsayed

As the number of quality characteristics to be monitored in complex processes increases, simultaneous monitoring becomes less sensitive to out-of-control signals, especially when only a few variables are responsible for the abnormal situation. We introduce a new process control chart for monitoring high-dimensional processes based on a ridge-penalized likelihood. Accurate probability distributions under the null and alternative hypotheses are obtained. In addition, we derive several theoretical properties of the proposed method and demonstrate that the proposed chart performs well in monitoring high-dimensional processes.
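
A generic sketch of the idea, not the authors' exact chart: with a ridge penalty, the penalized likelihood estimate of a mean shift under an identity covariance is a shrunken observation, and the resulting charting statistic reduces to a shrunken Hotelling-type statistic whose control limit can be calibrated by in-control simulation:

```python
import numpy as np

# Generic sketch (not the authors' exact chart). Ridge-penalized
# likelihood estimate of the mean of one observation x ~ N(mu, I):
#   mu_hat = argmin ||x - mu||^2 + lam * ||mu||^2 = x / (1 + lam),
# giving the shrunken charting statistic T = mu_hat' x.
rng = np.random.default_rng(3)
p, lam = 50, 4.0

def stat(x):
    mu_hat = x / (1 + lam)          # ridge-shrunken mean estimate
    return float(mu_hat @ x)        # charting statistic

# Calibrate the control limit for a 0.5% in-control false-alarm rate.
incontrol = [stat(rng.standard_normal(p)) for _ in range(20000)]
limit = np.quantile(incontrol, 0.995)

# Phase II monitoring: a shift of 3.0 in only 5 of the 50 variables
# appears at t = 6.
shift = np.zeros(p)
shift[:5] = 3.0
for t in range(1, 11):
    x = rng.standard_normal(p) + (shift if t > 5 else 0.0)
    T = stat(x)
    print(f"t={t:2d}  T={T:6.2f}  signal={T > limit}")
```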

3 - A Nonparametric Adaptive Sampling Strategy For Online Monitoring Of Big Data Streams

Xiaochen Xian, UW-Madison, Madison, WI, xxian@wisc.edu, Andi Wang, Kaibo Liu

Modern, rapid advances in sensor technology generate huge amounts of data, posing unique challenges for statistical process control. We propose a nonparametric adaptive sampling (NAS) strategy to monitor non-normal big data streams online in the context of limited resources, where only partial observations are available. In particular, the proposed method integrates a rank-based CUSUM scheme with anti-rank statistics that correct for the partial observations, and it can effectively detect a wide range of possible mean shifts in all directions when each data stream follows an arbitrary distribution. Two theoretical properties of the NAS algorithm are investigated.
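
A minimal sketch of the rank-based ingredient on a single stream (the full NAS strategy, with adaptive sampling across streams and anti-rank corrections, is not reproduced) might look as follows, with the allowance k and limit h chosen arbitrarily for illustration:

```python
import numpy as np

# Minimal sketch of a distribution-free CUSUM on one data stream,
# using standardized ranks relative to an in-control reference
# sample; no normality assumption is needed.
rng = np.random.default_rng(4)
reference = rng.standard_exponential(500)    # in-control (non-normal) data

def rank_score(x, ref):
    """Standardized rank of x within the reference sample, in (-1, 1)."""
    r = (ref < x).mean()
    return 2 * r - 1

k, h = 0.2, 4.0                              # allowance and control limit
s = 0.0
for t in range(1, 41):
    # A mean shift of 1.0 appears at t = 21.
    x = rng.standard_exponential() + (1.0 if t > 20 else 0.0)
    s = max(0.0, s + rank_score(x, reference) - k)
    if s > h:
        print(f"alarm at t={t}, CUSUM={s:.2f}")
        break
```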

MB69

Old Hickory - Omni

Military Operations Research II

Sponsored: Military Applications

Sponsored Session

Chair: Natalie M Scala, Towson University, 8000 York Road, Towson, MD, 21252, United States, nscala@towson.edu

1 - A Value Model For Cybersecurity Metrics

Natalie M Scala, Assistant Professor, Towson University, 8000 York Road, Towson, MD, 21252, United States, nscala@towson.edu, Paul L Goethals

This research applies decision analysis perspectives to cybersecurity and creates a value model for performance metrics and best practices, supported by industry data and interviews with subject matter experts. The utility-theory-based value model includes attributes and values, scores metrics on their contribution to value, and provides a rank-ordered list of important metrics and best practices for implementation. We illustrate the value model and contribute an overall framework that can be customized for any organization. Results will enable organizations to assess the performance of cyber systems.
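
The standard additive form of such a multi-attribute value model is easy to sketch; the attributes, weights, and single-attribute value functions below are purely hypothetical, not the elicited values from this research:

```python
# Standard additive multi-attribute value model from decision analysis;
# the attributes, weights, and scores below are purely hypothetical.
weights = {"patch_latency": 0.4, "mfa_coverage": 0.35, "log_review": 0.25}

# Single-attribute value functions mapping raw metrics to [0, 1].
value_fns = {
    "patch_latency": lambda days: max(0.0, 1 - days / 30),  # faster is better
    "mfa_coverage": lambda frac: frac,                       # more is better
    "log_review": lambda per_wk: min(1.0, per_wk / 7),       # daily is best
}

def overall_value(metrics):
    """Weighted sum of single-attribute values."""
    return sum(weights[a] * value_fns[a](metrics[a]) for a in weights)

org = {"patch_latency": 10, "mfa_coverage": 0.8, "log_review": 3}
print(f"overall value: {overall_value(org):.3f}")
```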

2 - Efficient Benchmarking Tool Regarding Optimal Detection Of Critical Components In A Network

Gokhan Karakose, University of Missouri, gkz7c@mail.missouri.edu, Ronald McGarvey

Many mathematical and heuristic approaches have been proposed to assess the critical components of a network based on network connectivity metrics. Since the objectives examined through these metrics (e.g., minimum connectivity) are of great importance in many areas (e.g., immunization), an effective solution framework for determining the optimal values of such objectives is crucial. In this regard, we provide efficient mathematical models along with new valid inequality constraints that further decrease computational complexity compared to the best recent models. With this improvement, we broaden the application scope of exact solution methods for the determination of critical components.
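
The underlying optimization problem can be illustrated by brute force on a small graph: choose k nodes whose removal minimizes pairwise connectivity. The sketch below (using networkx for illustration) shows the metric only; the talk's mathematical models and valid inequalities are not reproduced:

```python
import itertools
import networkx as nx

# Brute-force illustration of the critical node detection problem on a
# small benchmark graph: remove k nodes so as to minimize pairwise
# connectivity (the number of node pairs that remain connected).
G = nx.karate_club_graph()
k = 2

def pairwise_connectivity(H):
    """Number of connected node pairs in graph H."""
    return sum(len(c) * (len(c) - 1) // 2 for c in nx.connected_components(H))

best = min(itertools.combinations(G.nodes, k),
           key=lambda S: pairwise_connectivity(G.subgraph(set(G.nodes) - set(S))))
print("most critical nodes:", best)
print("remaining connected pairs:",
      pairwise_connectivity(G.subgraph(set(G.nodes) - set(best))))
```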

3 - OMEGA: Evaluating Effectiveness Of Proposed Systems Using Bayesian Networks

Freeman Marvin, Innovative Decisions, 5848 Hunton Wood Drive, Broad Run, VA, 20137, United States, ffmarvin@innovativedecisions.com, Amanda Hepler

OMEGA is a new approach for designing affordable system architectures that meet user needs. OMEGA uses a Bayesian network of probability distributions that describes functional needs, system capabilities, and customer satisfaction. Measures of effectiveness (MOEs) are combined to estimate the probability that a proposed system will meet mission needs. Additionally, OMEGA can "back-cast" the system requirements necessary to achieve alternative levels of mission effectiveness. This innovative approach was developed by a collaborative team of requirements engineers and decision analysts. OMEGA is a flexible, low-cost approach for conducting architecture trades and developing requirements for any kind of system.
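
The MOE-combination step can be illustrated with a toy Bayesian network in which binary capability nodes feed a mission-success node through a conditional probability table; the network structure, probabilities, and node names below are entirely hypothetical, not OMEGA's:

```python
import itertools

# Toy Bayesian network (entirely hypothetical): two binary system
# capability nodes feed a binary mission-success node through a
# conditional probability table. P(success) is computed by
# enumerating over the capability states.
p_cap = {"sensor_range": 0.9, "data_link": 0.8}  # P(capability met)
# P(mission success | sensor_range met, data_link met)
cpt = {(True, True): 0.95, (True, False): 0.60,
       (False, True): 0.50, (False, False): 0.10}

p_success = 0.0
for s, d in itertools.product([True, False], repeat=2):
    prior = ((p_cap["sensor_range"] if s else 1 - p_cap["sensor_range"])
             * (p_cap["data_link"] if d else 1 - p_cap["data_link"]))
    p_success += prior * cpt[(s, d)]

print(f"P(mission success) = {p_success:.3f}")
# "Back-casting" would invert this computation: search over candidate
# capability levels for those achieving a target P(success).
```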

4 - Designing An Objective Metric For Evaluating Army Unit Readiness

Paul Goethals, United States Military Academy, West Point, NY, United States, paul.goethals@usma.edu, Natalie M Scala

Perhaps one of the most difficult assessments to make with any level of accuracy is military readiness; it is a frequent topic of interest in defense news in times of both combat and peace. This research proposes a readiness index tailored to objectively evaluate units based on their current status and future mission, using quality engineering tools as a foundation for measurement. A simulated comparison of the current and proposed readiness indices is provided to illustrate their differences in assessing Army units.
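
As one hypothetical illustration of using quality engineering tools for measurement (not the authors' index), a capability-index-style score in the spirit of Cpk could compare a unit's measured status on each dimension against mission-driven specification limits:

```python
import numpy as np

# Hypothetical illustration only: a capability-index-style readiness
# score in the spirit of Cpk. The dimensions, specification limits,
# and data below are invented for the example.
specs = {  # dimension: (lower spec limit, upper spec limit)
    "personnel_fill": (0.85, 1.00),
    "equipment_ready": (0.90, 1.00),
    "training_score": (0.80, 1.00),
}
rng = np.random.default_rng(5)
history = {d: rng.normal(0.92, 0.03, 30) for d in specs}  # recent measurements

def cpk(x, lsl, usl):
    """Process capability index of sample x against spec limits."""
    mu, s = x.mean(), x.std(ddof=1)
    return min(usl - mu, mu - lsl) / (3 * s)

readiness = {d: cpk(history[d], *specs[d]) for d in specs}
print({d: round(v, 2) for d, v in readiness.items()})
print("unit readiness index:", round(min(readiness.values()), 2))
```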
