Search Results (1 - 25 of 148 Results)


Shoop, Jessica A.
SENIOR INFORMATION TECHNOLOGY (IT) LEADER CREDIBILITY: KNOWLEDGE SCALE, MEDIATING KNOWLEDGE MECHANISMS, AND EFFECTIVENESS
Doctor of Philosophy, Case Western Reserve University, 2017, Management
This dissertation explains leader effectiveness in the context of the senior information technology (IT) leader, who plays a pivotal role in the execution and delivery of corporate IT services. Considered leaders of leaders, senior IT leaders typically report to the Chief Information Officer (CIO). Using a sequential three-phase mixed methods study, the thesis makes four contributions: (1) through qualitative inquiry, it shows that effective senior IT leaders maintain a balance of domain knowledge and emotional and social aptitudes; (2) it develops and validates a four-dimensional scale to measure the level of IT leader domain knowledge; (3) it demonstrates nomological and predictive validity of the scale and evaluates the impact of IT leader domain knowledge in solving managerial problems and brokering knowledge to others; and (4) the studies combine to build a cohesive argument that leadership credibility, of which technical domain knowledge forms a key component, is a critical antecedent of leadership effectiveness. The validation is founded on a sample of 104 senior IT leaders and 490 IT leader subordinates within a global IT service firm. Overall, our findings suggest that the hitherto neglected effect of IT domain knowledge forms not merely an important but a vital component influencing overall senior IT leader effectiveness. This has consequences for established theories of both leader credibility and leader effectiveness in highly specialized technical domains. Practically, the study underscores the importance of hiring and retaining senior IT leaders with strong technical credentials.

Committee:

Kalle Lyytinen, Ph.D. (Committee Chair); Jagdip Singh, Ph.D. (Committee Member); Genevieve Bassellier, Ph.D. (Committee Member); John King, Ph.D. (Committee Member)

Subjects:

Business Administration; Information Systems; Information Technology; Management

Keywords:

Senior IT Leaders; Leadership Effectiveness; Credibility; Domain Knowledge; Leader Knowledge; Knowledge Mechanisms; Scale Development; Multi-dimensional Scale Validity; Mixed Methods

Boosabaduge, Prasad Priyadarshana Fernando
Hybrid Recommender System Architecture for Personalized Wellness Management
Master of Science, University of Akron, 2016, Electrical Engineering
This thesis presents a hybrid recommender system that suggests exercises to participants to improve their wellness. Current healthcare costs in the USA exceed $3.8 trillion, and the reactive approach to managing health is not sustainable. A focus on keeping healthy people healthy is one approach to addressing this serious societal concern. While recommender systems have been extensively used to recommend items such as movies, books, products, and services to users, recommending exercises is a challenging and nuanced problem. The hybrid recommender system proposed here comprises two subsystems. A rule-based system, which incorporates exercise science domain knowledge, first identifies a set of exercises that can be recommended based on the participant's current level of activity, health status, and diet. These choices are presented to the participant to elicit the participant's preferences for one or more of the exercises. A collaborative filtering based recommender system then identifies alternative exercises based on the indicated participant preferences. Simulated results with a spectrum of synthetic participants demonstrate the efficacy of the approach. In the future, the rule-based system can be extended to incorporate knowledge from the domain of behavioral sciences. The recommender system can also be extended to consider factors such as the efficacy of the exercises and behavioral outcomes.
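The two-stage pipeline described above can be sketched in a few lines. This is a hypothetical illustration only: the exercise catalog, intensity rules, and preference data are invented, and the actual system uses richer exercise-science rules and participant models.

```python
# Hypothetical sketch of the two subsystems: a rule-based pre-filter followed
# by a simple collaborative-filtering step. All names and data are invented.

def rule_based_candidates(profile, catalog):
    """Keep exercises whose intensity fits the participant's activity level."""
    return [ex for ex, intensity in catalog.items()
            if intensity <= profile["max_intensity"]]

def collaborative_scores(target_prefs, peer_prefs, candidates):
    """Score candidate exercises by preferences of peers who share liked exercises."""
    scores = {}
    for peer in peer_prefs:
        overlap = len(target_prefs & peer)
        if overlap == 0:
            continue  # peer shares nothing with the target participant
        for ex in peer - target_prefs:
            if ex in candidates:
                scores[ex] = scores.get(ex, 0) + overlap
    return sorted(scores, key=scores.get, reverse=True)

catalog = {"walking": 1, "cycling": 2, "swimming": 2, "running": 3}
profile = {"max_intensity": 2}
candidates = rule_based_candidates(profile, catalog)   # excludes "running"
target = {"walking"}
peers = [{"walking", "cycling"}, {"walking", "swimming"}, {"running"}]
print(collaborative_scores(target, peers, candidates))
```

The rule-based stage guarantees safety constraints are met before any preference-based ranking happens, which mirrors the subsystem ordering in the abstract.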

Committee:

Shivakumar Sastry, Dr. (Committee Chair); Nghi Tran, Dr. (Committee Member); Forrest Bao, Dr. (Committee Member); Judith Juvancic-Heltzel, Dr. (Committee Member)

Subjects:

Computer Science; Information Systems

Keywords:

Personal Wellness Management, Exercise, Recommender System, Expert System

AYDAR, MEHMET
Developing a Semantic Framework for Healthcare Information Interoperability
PHD, Kent State University, 2015, College of Arts and Sciences / Department of Computer Science
Interoperability in healthcare is defined as the ability of health information systems to work together within and across organizational boundaries in order to advance the effective delivery of healthcare for individuals and communities. The current healthcare information technology environment breeds incredibly complex data ecosystems. In many cases pertinent patient records are collected in multiple systems, often supplied by competing manufacturers with diverse data formats. This causes inefficiencies in data interoperability, as different formats of data create barriers to exchanging health information. This dissertation presents a semantic framework for healthcare information interoperability. We propose a system for translation of healthcare instance data, based on structured mapping definitions and using RDF as a common information representation, to achieve semantic interoperability between different data models. Moreover, we introduce an entity similarity metric that utilizes the Jaccard index with the common relations of the data entities and the common string literal words referenced by the data entities, augmented with the similarity of the data entities' neighbors. The precision of the similarity metric is enhanced by incorporating auto-generated importance weights of the entity descriptors in the RDF representation of the dataset. Furthermore, we provide an automatic classification method, which we call summary graph generation, based on the pairwise entity similarities, and we propose that the summary graph can further be utilized for interoperability purposes. Finally, we present a suggestion-based semi-automatic instance matching system and test it on the RDF representation of a healthcare dataset. The system utilizes the entity similarity metric and presents similar node pairs to the user for possible instance matching.
Based on the user feedback, it merges the matched nodes and suggests more matching pairs depending on the common relations and neighbors of the already matched nodes. We propose that the instance matching technique could be leveraged for mapping between separate data models.
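The core of the similarity metric above is a Jaccard index over an entity's descriptors. A minimal sketch, assuming toy entity descriptions (the dissertation additionally applies importance weights and neighbor similarity, both omitted here):

```python
# Minimal sketch of the Jaccard-based entity similarity: descriptors are the
# entity's relation names plus words from its string literals. The two toy
# patient entities below are invented for illustration.

def jaccard(a, b):
    """|A ∩ B| / |A ∪ B| for two descriptor sets (0.0 for two empty sets)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def entity_descriptors(entity):
    """Collect relation names plus lower-cased words from string literals."""
    descriptors = set(entity["relations"])
    for literal in entity["literals"]:
        descriptors.update(literal.lower().split())
    return descriptors

patient_a = {"relations": {"hasName", "hasDiagnosis"},
             "literals": ["Type 2 Diabetes"]}
patient_b = {"relations": {"hasName", "hasDiagnosis", "hasAllergy"},
             "literals": ["Type 1 Diabetes"]}

sim = jaccard(entity_descriptors(patient_a), entity_descriptors(patient_b))
print(round(sim, 3))  # 4 shared descriptors out of 7 total
```

Pairwise scores like this one are what the summary-graph generation and the suggestion-based instance matcher consume downstream.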

Committee:

Austin Melton (Advisor); Angela Guercio (Committee Member); Ye Zhao (Committee Member); Alan Brandyberry (Committee Member); Helen Piontkivska (Committee Member); Javed I. Khan (Committee Chair); James L. Blank (Other)

Subjects:

Computer Science; Health Care; Health Sciences; Information Systems; Information Technology; Medicine

Keywords:

Healthcare Information Interoperability;Semantic Web;RDF;Translation of Instance Data;Summary Graph;RDF Instance Match;RDF Entity Similarity;Automatic Mapping;Information Translation

Geise, Gregory
Application of Geographical Information Systems to Determine Human Population Impact on Water Resources of Yellow Springs, Ohio, and the Use of LiDAR Intensities in Land Use Classification
Master of Science (MS), Wright State University, 2016, Earth and Environmental Sciences
The purposes of the following studies were to investigate natural and human influences on several spatial and temporal aspects of a local and regional environment. The decreasing discharge rate of the groundwater-supplied Yellow Spring may be caused by the increase in population of the nearby Village of Yellow Springs, Ohio. Periodic measurements of Yellow Spring's discharge rate compared to changes in the town's population showed an inverse relationship, where spring discharge declined as population grew. A sharp decrease in discharge occurred during a period when the spring's facade was modified and an airport was built partially overlying the spring's recharge area. These events are believed to have had a greater impact on spring discharge rate than changing population because the discharge rate remained relatively constant after its sharp decline, while population began to decline. Aquifer volume change was determined by calculating the volume difference between decadal average water tables that were modeled with ArcMap from depth-to-water measurements in water wells and LiDAR elevation data. Counterintuitively, aquifer volume generally increased with population, then fell sharply as the population gradually decreased. A slight increase in aquifer volume after withdrawal wells were installed suggests that human consumption had little impact on aquifer volume. When compared to the average Palmer Hydrological Drought Index, aquifer volume generally lowered during dry periods and rose during wet periods. Minor variations in climate can greatly impact aquifer volume, because precipitation needed to decrease by only 0.26 percent over a 40-year period to account for the lowest calculated aquifer volume. Determining the composition and spatial extent of land uses through land use classification increases our understanding of processes that are harmful to the environment.
Because of LiDAR's high spatial resolution, the ability to classify marginally rural land uses of Greene County, Ohio, with LiDAR intensity data was assessed to improve the accuracy of land uses previously classified from lower-resolution satellite images. Trends in frequency distributions of intensity values extracted from sample sites of six major land uses reveal that LiDAR, measuring in the near-infrared (1064 nm), is spectrally insufficient to distinguish between land use elements (grass, trees, pavement, buildings, etc.), where each intensity value identifies between 3 and 6 land use elements. Land use elements with the same intensity values can be distinguished when remotely sensed data of other wavelengths are added to create spectral variation. The ability to classify land uses with LiDAR intensity data is further reduced by its poor temporal resolution and large file size. LiDAR surveys are typically conducted in early spring when trees are leafless to allow for ground elevation measurements in forested areas. LiDAR .las files are large because of their high spatial resolution and require significant computing resources to process.
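The aquifer-volume computation described above amounts to differencing two gridded water-table surfaces and multiplying by cell area. A toy version, assuming invented 3x3 grids and a 10 m cell size (the study did this in ArcMap from well and LiDAR data):

```python
# Toy sketch of the volume-difference calculation: subtract two decadal
# water-table rasters cell by cell and scale by cell area. Grids are invented.

CELL_AREA = 10 * 10  # square metres per grid cell (assumed cell size)

def volume_change(surface_old, surface_new, cell_area=CELL_AREA):
    """Sum per-cell water-table elevation changes (m) times cell area -> m^3."""
    return sum((new - old) * cell_area
               for row_old, row_new in zip(surface_old, surface_new)
               for old, new in zip(row_old, row_new))

# Hypothetical decadal-average water-table elevations in metres
decade_1990s = [[240.0, 240.5, 241.0],
                [240.2, 240.6, 241.1],
                [240.4, 240.8, 241.3]]
decade_2000s = [[239.8, 240.3, 240.9],
                [240.0, 240.5, 241.0],
                [240.3, 240.7, 241.2]]

print(volume_change(decade_1990s, decade_2000s))  # negative => volume fell
```

Summing signed differences rather than absolute ones is what lets wet-period rises and dry-period declines offset, matching the drought-index comparison in the abstract.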

Committee:

Doyle Watts, Ph.D. (Advisor); Songlin Cheng, Ph.D. (Committee Member); Abinash Agrawal, Ph.D. (Committee Member)

Subjects:

Environmental Science; Geography; Hydrology; Information Systems; Physical Geography; Remote Sensing

Keywords:

GIS; LiDAR intensity; LiDAR elevation; water resources; land use classification;

Lipkin, Ilya
Testing Software Development Project Productivity Model
Doctor of Philosophy in Manufacturing and Technology Management, University of Toledo, 2011, Manufacturing and Technology Management

Software development is an increasingly influential factor in today’s business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted.

There is no accurate model or measure available that can guide an organization in its quest for accurate software development estimation, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. In attempts to address this issue, existing models are usually calibrated using local data with small sample sizes, yet the resulting estimates do not offer improved cost analysis.

This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis based on Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD.

Practical implications of this study allow for practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers.

Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains, such as IT, Command and Control, and Simulation. This research validates findings from previous work concerning software project productivity and leverages those results in this study. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.

Committee:

Jeen Su Lim (Committee Chair); James Pope (Committee Member); Michael Mallin (Committee Member); Michael Jakobson (Committee Member); Wilson Rosa (Advisor)

Subjects:

Aerospace Engineering; Armed Forces; Artificial Intelligence; Business Administration; Business Costs; Computer Engineering; Computer Science; Economic Theory; Economics; Electrical Engineering; Engineering; Industrial Engineering; Information Science; Information Systems; Information Technology; Management; Marketing; Mathematics

Keywords:

"Software Estimation"; "Software Cost Model"; "Department of Defense Data"; COCOMO; "Software Project Productivity Model"

Chen, Wei
Developing a Framework for Geographic Question Answering Systems Using GIS, Natural Language Processing, Machine Learning, and Ontologies
Doctor of Philosophy, The Ohio State University, 2014, Geography
Geographic question answering (QA) systems can help make geographic knowledge accessible by directly giving answers to natural language questions. In this dissertation, a geographic question answering (GeoQA) framework is proposed that incorporates techniques from natural language processing, machine learning, ontological reasoning, and geographic information systems (GIS). We demonstrate that GIS functions provide valuable rule-based knowledge, which may not be available elsewhere, for answering geographic questions. Ontologies of space are developed to interpret the meaning of linguistic spatial terms, which are later mapped to components of a query in a GIS; these ontologies are shown to be indispensable during each step of question analysis. A customized classifier based on dynamic programming and a voting algorithm is also developed to classify questions into answerable categories. To prepare a set of geographic questions, we conducted a human survey and derived the four categories containing the most questions for experiments. These categories were later used to train a classifier to classify new questions. Classified natural language questions are converted to spatial SQL queries to retrieve data from relational databases. Consequently, our demo system is able to give exact answers to four categories of geographic questions within an average time of two seconds. The system has been evaluated using classical machine learning-based measures and achieved an overall accuracy of 90% on test data. Results show that spatial ontologies and GIS are critical for extending the capabilities of a GeoQA system. The spatial reasoning of GIS makes it a powerful analytical engine for answering geographic questions through spatial data modeling and analysis.
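The final step above — turning a classified question into spatial SQL — can be sketched as template filling. The category labels, table names, and PostGIS-style functions below are assumptions for illustration, not the dissertation's actual schema:

```python
# Illustrative sketch: a classified geographic question is rendered into a
# parameterized spatial SQL template. Categories and schema are invented.

SQL_TEMPLATES = {
    # question category -> spatial SQL template (PostGIS-style syntax)
    "distance": ("SELECT ST_Distance(a.geom, b.geom) FROM places a, places b "
                 "WHERE a.name = '{p1}' AND b.name = '{p2}'"),
    "containment": ("SELECT b.name FROM regions b, places a "
                    "WHERE a.name = '{p1}' AND ST_Contains(b.geom, a.geom)"),
}

def to_spatial_sql(category, **params):
    """Render the template for a question category with extracted place names."""
    return SQL_TEMPLATES[category].format(**params)

# "How far is Columbus from Cleveland?" -> classified as "distance"
sql = to_spatial_sql("distance", p1="Columbus", p2="Cleveland")
print(sql)
```

In the framework, the classifier supplies the category and the question-analysis step (guided by the spatial ontologies) supplies the parameters; the template is the GIS-side rule-based knowledge.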

Committee:

Eric Fosler-Lussier, Dr. (Committee Member); Rajiv Ramnath, Dr. (Committee Member); Daniel Sui, Dr. (Committee Member); Ningchuan Xiao, Dr. (Committee Chair)

Subjects:

Cognitive Psychology; Computer Science; Geographic Information Science; Geography; Information Science; Information Systems; Information Technology; Language

Keywords:

geographic information system; GeoQA; geographic question answering framework; geolinguistics; spatial ontologies;

Leverington, Cheyanna Leigh
GIS and Spatial Database Expansion as a Means to Enhance Planning, Water Demand Projections and the Impacts of Climate Change: An Internship with the NYC Department of Environmental Protection and a NNEMS Fellowship with the US EPA
Master of Environmental Science, Miami University, 2014, Environmental Sciences
This report details my internship experience with the New York City Department of Environmental Protection (NYCDEP) in the Bureau of Environmental Planning and Analyses (BEPA) and a National Network for Environmental Management Studies (NNEMS) Fellowship experience with the US EPA as a member of the GIS Team. At the NYCDEP my focus was to develop and analyze data, demographics, core infrastructure elements, wastewater, zoning data, land use and water consumption using GIS. The focus of my fellowship with the EPA was to expand the agency's enterprise spatial database by performing cross media analyses and acquiring missing data layers to support wetland protection, environmental compliance and environmental impacts due to climate change. My initial experiences with the internship and fellowship involved familiarizing myself with the subject matter and exploring the existing spatial databases and research tools. Along with learning municipal and federal environmental planning strategies, the majority of my time was spent on creating a map of properties within Combined Sewer Overflow (CSO) drainage areas to facilitate impact analyses of new development and updating, enhancing, and expanding the EPA's spatial database. The map of properties within drainage areas was published by the city of New York in the City Environmental Quality Review (CEQR) Technical Manual and made available online to agencies and residents of New York City.

Committee:

Thomas Crist, PhD (Committee Chair); Suzanne Zazycki (Committee Member); Robbyn Abbitt (Committee Member)

Subjects:

Climate Change; Conservation; Environmental Health; Environmental Science; Information Systems; Natural Resource Management

Keywords:

climate; water; sewer; new york city; combined sewer overflow; GIS; drainage

Pantelopoulos, Alexandros A.
PROGNOSIS: A WEARABLE SYSTEM FOR HEALTH MONITORING OF PEOPLE AT RISK
Doctor of Philosophy (PhD), Wright State University, 2010, Computer Science and Engineering PhD
Wearable Health Monitoring Systems (WHMS) have drawn a lot of attention from the research community and the industry during the last decade. The development of such systems has been motivated mainly by increasing healthcare costs and by the fact that the world population is ageing. In addition, R&D in WHMS has been propelled by recent technological advances in miniature bio-sensing devices, smart textiles, microelectronics, and wireless communication techniques. These portable health systems can comprise various types of small physiological sensors, which enable continuous monitoring of a variety of human vital signs and other physiological parameters such as heart rate, respiration rate, body temperature, blood pressure, perspiration, oxygen saturation, electrocardiogram (ECG), body posture, and activity. As a result, and also due to their embedded transmission modules and processing capabilities, wearable health monitoring systems can constitute low-cost and unobtrusive solutions for ubiquitous health, mental, and activity status monitoring. The majority of the currently developed WHMS research prototypes and products provide the basic functionality of continuously logging and transmitting physiological data. However, WHMS have the potential of achieving early detection and diagnosis of critical health changes that could enable prevention of health-hazardous episodes. To do so, they should be able to learn individual user baselines and also employ advanced information processing algorithms and diagnostics in order to discover problems autonomously and detect alarming health trends, and consequently inform medical professionals for further assistance. In an effort to advance the capabilities of a wearable system towards these goals, we focus in this dissertation on the development of a novel WHMS, called Prognosis.
The developed prototype platform includes the following innovative features, which constitute the main research contributions of this work: a) a novel and highly accurate methodology for classifying ECG recordings on a resource-constrained device, based on the Matching Pursuits algorithm and a Neural Network; b) a physiological data fusion scheme based on a fuzzy regular formal language model, whereby the current state of the corresponding fuzzy Finite State Machine signifies the current health state and context of the patient; c) the extension of the decision making methodology based on a modified Fuzzy Petri Net (FPN) model; d) the integration of a user-learning strategy based on a neural-fuzzy extension of the FPN; e) the incorporation of a system-patient dialogue interaction in order to capture non-measurable patient symptoms such as chest pain, dizziness, and malaise; and finally f) the prototyping of the system based on a smart-phone that runs multi-threaded J2ME software for handling multiple simultaneous Bluetooth connections with off-the-shelf wireless bio-sensors.
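The data-fusion idea in contribution (b) — sensor readings driving a state machine whose current state is the patient's health context — can be illustrated crisply. Note this sketch is crisp rather than fuzzy, and the states, thresholds, and transitions are invented; the dissertation's fuzzy FSM assigns graded memberships instead:

```python
# Crisp toy version of the FSM-based data fusion: fused vital-sign samples
# drive transitions between invented health states.

def classify_reading(hr, spo2):
    """Map one (heart rate, oxygen saturation) sample to an input symbol."""
    if spo2 < 90 or hr > 140:
        return "critical"
    if hr > 100:
        return "raised"
    return "ok"

TRANSITIONS = {
    ("normal", "raised"): "elevated",
    ("elevated", "raised"): "elevated",
    ("normal", "critical"): "alert",
    ("elevated", "critical"): "alert",
    ("elevated", "ok"): "normal",
    ("alert", "ok"): "elevated",   # de-escalate one step at a time
}

def run(samples, state="normal"):
    """Feed (hr, spo2) samples through the FSM; unknown pairs keep the state."""
    for hr, spo2 in samples:
        state = TRANSITIONS.get((state, classify_reading(hr, spo2)), state)
    return state

print(run([(80, 98), (120, 97), (150, 88)]))  # escalates to 'alert'
```

A fuzzy FSM generalizes this by letting the system occupy several states with partial membership, which is what allows gradual trends (rather than hard threshold crossings) to be detected.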

Committee:

Nikolaos Bourbakis, PhD (Advisor); Soon Chung, PhD (Committee Member); Yong Pei, PhD (Committee Member); Arnab Shaw, PhD (Committee Member); Larry Lawhorne, PhD (Committee Member)

Subjects:

Computer Science; Engineering; Health Care; Information Systems

Keywords:

wearable health monitoring system; ECG classification; ECG denoising; medical decision support system; smart-phone

Jung, Yusun
A Dialogic Action Perspective on Open Collective Inquiry in Online Forums
Doctor of Philosophy, Case Western Reserve University, 2012, Management
In today's networked environment, online forums have emerged as a popular form of social structure offering greater opportunities for learning in various organizational contexts. A plethora of studies have investigated the phenomenon to identify antecedents of its success, such as individual characteristics and organizational structure. However, how such antecedents get involved in collaborative learning processes and influence their outcomes has been largely understudied. Furthermore, the learning process in online forums has been simply presumed to be a kind of general organizational learning, despite its unique situation of learning from strangers. This dissertation focuses on online forums highly motivated toward problem-based learning and explores the dynamic process of such learning, namely Open Collective Inquiry (OCI). Presuming that dialogue embodies open collective inquiry processes, this study investigated characteristics of OCI dialogues that influence distinct types of inquiry outcomes using a grounded theory method. In particular, the study highlights what participants do for OCI and how they do it through their dialogue. Based on distinct purposes for dialogic actions, six action domains were identified that constitute OCI processes: action domains to initiate inquiry, to maintain commitment, to guide the inquiry process, to frame a problem, to negotiate solutions, and to confirm workability. These action domains were interrelated in shaping OCI processes. The varying extent to which participants performed the purposes of these action domains was found to influence distinct types of outcomes, such as full closure, partial closure, non-closure, and degraded closure. To derive a more systematic account of how participants of OCI perform such purposes, three dimensions of dialogic action were proposed: action performed, content of action, and argumentative components.
These dimensions were used to characterize the essential dialogic actions in each action domain for successful OCI. In this way, three factors are proposed that influence OCI outcomes: fulfillment of essential dialogic actions, OCI initiators' role, and inquiry context. Based on these findings, a dialogic action model of OCI in online forums emphasizes OCI initiators' active roles and an inquiry context encouraging validation and improvement. These characteristics influence the essential dialogic actions of open collective inquiry that perform reflection, experimentation, and validation. A discussion of implications for research and practice concludes this dissertation.

Committee:

Richard Boland, Jr. (Advisor); Kalle Lyytinen (Committee Member); Richard Buchanan (Committee Member); John Paul Stephens (Committee Member); Youngjin Yoo (Committee Member)

Subjects:

Information Systems

Keywords:

collective action; collaborative learning; online community; open collective inquiry; open innovation; dialogic action; discourse analysis

McCutcheon, Angela M.
Impact of Publishers’ Policy on Electronic Thesis and Dissertation (ETD) Distribution Options within the United States
Doctor of Philosophy (PhD), Ohio University, 2010, Curriculum and Instruction Instructional Technology (Education)

The purpose of this study was to determine if large circulation journal publishers were rejecting articles submitted for publication because the submitted articles were derived from Electronic Theses and Dissertations (ETDs). In this study, 403 universities were found to file ETDs in university repositories or in the ProQuest/UMI commercial repository. ETD university personnel were surveyed online and asked to report the number of graduate student alumni who reported publisher rejections for articles submitted for publication, because the articles were derived or taken directly from ETDs. In addition, other data were collected from ETD university personnel regarding ETD program policies and practices to determine if these policies and practices influenced the number of publisher rejections.

The results of this study show that two ETD universities reported three publisher rejections for articles that were submitted for publication because the articles were derived from ETDs. Since a small number of ETD universities personnel reported publisher rejections (1.8% = 2 universities/109 responses), ETD university policies and practices were examined to determine if they were assisting students in avoiding publisher rejections.

Several ETD program policies and practices are aiding students in avoiding publisher rejections. The distribution options and publication delays offered by ETD universities were flexible enough to allow students to publish from their theses and dissertations even when the students selected the wrong distribution option at the time of graduation. ETD universities within the United States appear to be doing an exceptional job of assisting students in publishing articles and books derived from ETDs.

Current ETD programs can move forward with confidence that they have found ways to assist students in avoiding publisher rejections through the types of distribution options offered, publication delays, and the flexibility to change distribution options for graduate student alumni when they have difficulties publishing from their ETDs. They can also feel more at ease that publishers appear to be considering ETDs as pre-prints in many cases. Yet ETD universities should remain aware that many publishers are resistant to allowing students to place previously published articles inside their ETDs.

Committee:

David Richard Moore, Ph.D. (Advisor); George Johanson, Ph.D. (Committee Member); Valerie Martin-Conley, Ph.D. (Committee Chair); Candice Maddox, Ed.D. (Committee Member)

Subjects:

Computer Science; Education; Information Systems; Teaching; Technology

Keywords:

ETD; Electronic Thesis and Dissertation; publisher rejection; thesis; dissertation; online publishing; university repository; ProQuest; ProQuest/UMI; online research

Post, David L.
Network Management: Assessing Internet Network-Element Fault Status Using Neural Networks
Master of Communication Technology and Policy (MCTP), Ohio University, 2008, Information and Telecommunication Systems (Communication)
This thesis explores the implementation of a neural network for analysis and decision-making purposes within the realm of network management and serves as a stepping stone toward a solution for a much larger problem. The thesis discusses the current landscape of network management software, explores the possible benefits of a neural network augmented solution, tests the efficacy of a neural network in several scenarios where it must classify network element performance data, and compares these results against more familiar linear regression methods. Results showed that the neural network was far superior at classifying the results. However, the general applicability of these results is not yet known, since the dataset used was from a single commercial Internet Service Provider.
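The classification task above can be illustrated with the smallest possible neural network, a single perceptron, on invented data. This is a toy: the thesis used a full neural network on real ISP measurements, and the two features and thresholds here are assumptions for illustration:

```python
# Toy illustration: a single perceptron labels network-element status from two
# synthetic performance features (packet-loss %, latency in units of 100 ms).

def train_perceptron(data, epochs=20, lr=0.1):
    """Classic perceptron learning rule on (features, label) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:   # label: 1 = faulty, 0 = healthy
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Synthetic, linearly separable samples: (packet_loss, latency/100) -> status
data = [((0.1, 0.1), 0), ((0.2, 0.15), 0), ((5.0, 2.0), 1), ((7.5, 3.0), 1)]
w, b = train_perceptron(data)
print([predict(w, b, *x) for x, _ in data])  # matches the training labels
```

On cleanly separable toy data a linear regression threshold would do as well; the thesis's point is that on real, noisy element-performance data the neural network's nonlinearity gave it the edge.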

Committee:

John Hoag (Committee Chair); Philip Campbell (Committee Member); Andy Snow (Committee Member)

Subjects:

Communication; Information Systems; Management

Keywords:

Network management; Neural Network; Fault Analysis

Lusk, David Michael
An Evaluative Study of User Satisfaction and Documentation Compliance: Using an Electronic Medical Record in an Emergency Department
Master of Science, The Ohio State University, 2010, Allied Medical Professions

With a general lack of knowledge regarding electronic medical records in the ED, the objectives of this study were to assess user satisfaction and measure documentation compliance of an ED EMR six years post-implementation. User satisfaction was measured using a survey instrument, while documentation compliance was measured by conducting a retrospective patient chart audit analysis across two EDs in the same health system. One ED utilized the EMR, while the other was still using paper charting.

ANOVAs were calculated to determine significant differences in socio-demographic variables across survey responses. A total of 106 (35%) users completed the survey, and means indicated that respondents were generally satisfied with the ED EMR. Both age range and user role in the ED showed significant differences across survey categories. T-tests and Fisher Exact Tests were calculated to determine significant differences in chart compliance between EDs. Patient chart audits showed an overall significant difference between the EMR (98 percent compliant) and paper charting (86 percent compliant), along with several chart components (14 out of 23) that were significantly different and favored the EMR over the paper process. Both types of studies conducted here should become a permanent, ongoing effort by health systems to continue to measure and improve ED EMR usability and long-term sustainability.

Committee:

Melanie Brodnik, PhD (Advisor); Laurie Rinehart-Thompson, JD, RHIA, CHP (Committee Member); Nina Kowalczyk, PhD, RT(R)(CT)(QM), FASRT (Committee Member); Susan White, PhD, CHDA (Committee Member)

Subjects:

Health Care; Information Systems; Technology

Keywords:

electronic medical records; EMR; emergency department; health information technology; user satisfaction; compliance; documentation compliance; ED EMR; ED

Da Silva, Ralston A.
Green Computing – Power Efficient Management in Data Centers Using Resource Utilization as a Proxy for Power
Master of Science, The Ohio State University, 2009, Computer Science and Engineering
Many organizations are working towards reducing the carbon footprint of their data centers, i.e., reducing their power consumption. Server virtualization is used to decrease power consumption by consolidating multiple servers onto a few physical machines. Virtualization provides increased flexibility by providing a means to dynamically move virtual machines from one physical machine to another. Using resource utilization as a proxy for power, we build models of power consumption for individual server types and use this information, along with business value and SLA information, to efficiently allocate virtual machines to physical machines.
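The consolidation step above is, at its simplest, a bin-packing problem. A minimal sketch using first-fit-decreasing, assuming a one-dimensional utilization model with invented capacities and loads (the thesis additionally weighs business value, SLAs, and per-server power models):

```python
# Sketch: pack VM utilizations onto as few hosts as possible so idle hosts
# can be powered down. Loads are invented CPU-utilization percentages.

def place_vms(vm_loads, host_capacity):
    """First-fit decreasing: returns hosts, each a list of VM loads <= capacity."""
    hosts = []
    for load in sorted(vm_loads, reverse=True):
        for host in hosts:
            if sum(host) + load <= host_capacity:
                host.append(load)
                break
        else:
            hosts.append([load])  # no room anywhere: power on a new machine
    return hosts

vms = [50, 20, 70, 10, 40, 30]          # utilization percent per VM
hosts = place_vms(vms, host_capacity=90)  # leave headroom below 100%
print(len(hosts), hosts)
```

Keeping the capacity below 100% is the SLA-style headroom the abstract alludes to; the power saving comes from the hosts that end up empty and can be switched off.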

Committee:

Rajiv Ramnath, PhD (Advisor); Jay Ramanathan, PhD (Committee Member); Paul Sivilotti, PhD (Committee Member)

Subjects:

Computer Science; Earth; Energy; Engineering; Industrial Engineering; Information Systems; Management; Technology

Keywords:

green computing; data center; power optimization; carbon footprint; resource utilization as a proxy for power; hardware profiling; application profiling; data center management architecture; power manager

Chen, Wei. The Design and Implementation of a Web-based GIS for Political Redistricting
Master of Arts, The Ohio State University, 2009, Geography

The World Wide Web (WWW) has dramatically changed the way we produce, use, and consume information, especially geospatial information, in recent years. Web-based GIS (Geographic Information Systems) are designed to provide Web users with analytical tools to assist their spatial decision-making processes. With advantages such as platform independence, customizability, and cost effectiveness, Open Source Geospatial (OSGEO) software has been increasingly adopted to develop Web-based GIS applications. The growing availability of spatial functionality in OSGEO software has also opened many possibilities for implementing a more powerful, interactive, and collaborative Web-based GIS platform, often referred to as the GeoWeb. However, compared with proprietary systems, current open-source online GIS systems have several limitations. For example, most do not provide customizable web mapping services or spatial data processing services, yet these two types of services are essential for effectively filtering spatial information and exploring areas of interest.

This research introduces a framework for implementing a Web-based GIS using open source software, including PostgreSQL/PostGIS, MapServer, and OpenLayers. On the server side, PostgreSQL/PostGIS is used to store and process spatial data, and MapServer provides the Web Mapping Service (WMS). The server-side scripting language PHP is employed to dynamically generate map files from PostGIS for MapServer to render. On the client side, OpenLayers provides the programming interface to incorporate layers from different data sources into the same DOM container. A Web-based GIS for political redistricting has been developed as an example to demonstrate both the merits and demerits of adopting this framework.

Initial results of the demonstration show that the integration of PostGIS, MapServer, and PHP can facilitate query-based map generation and make mapping of massive spatial data efficient. The query-based Web Mapping Service is capable of dynamically generating map and legend images. The spatial data handling functions in PostGIS are suitable for developing interactive functions for querying, measuring, and processing spatial data. Users can use the implemented Web-based political redistricting GIS to explore census data, devise and evaluate new plans, and compare different plans. This framework, based on an open environment, can be adapted to applications with similar requirements. The application implemented in this research can be accessed through gis.osu.edu/redistricting.
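Devising and comparing redistricting plans typically involves district compactness scores. As a hedged illustration (the abstract does not specify which metrics the system computes), the widely used Polsby-Popper score can be derived from a district's boundary coordinates with the shoelace formula:

```python
import math

def polsby_popper(vertices):
    """Compactness = 4*pi*Area / Perimeter^2; equals 1.0 for a circle."""
    n = len(vertices)
    area = 0.0
    perim = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1               # shoelace formula term
        perim += math.hypot(x2 - x1, y2 - y1)   # edge length
    area = abs(area) / 2.0
    return 4 * math.pi * area / perim ** 2

# A unit square scores pi/4; irregular, sprawling districts score much lower
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
score = polsby_popper(square)
```

In a PostGIS-backed system such as the one described, the same quantity would more naturally be computed server-side from `ST_Area` and `ST_Perimeter` over district geometries.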

Committee:

Ningchuan Xiao (Advisor); Mei-Po Kwan (Committee Chair); Daniel Sui (Committee Member)

Subjects:

Computer Science; Geography; Information Systems; Political Science

Keywords:

Web-based GIS; Political Redistricting; Open Source Geospatial (OSGEO); Public Participation

Gardner, John Wallace. Improving Hospital Quality and Patient Safety - An Examination of Organizational Culture and Information Systems
Doctor of Philosophy, The Ohio State University, 2012, Business Administration

This dissertation examines the effects of safety culture, including operational climate and practices, as well as the adoption and use of information systems, on the delivery of high quality healthcare and improved patient experience. Chapter 2 studies the influence of both general and outcome-specific hospital climate and quality practices on process of care. Primary survey data from 272 hospitals across the U.S. are combined with process of care performance data reported by the Centers for Medicare and Medicaid Services (CMS). The results indicate that general safety climate and quality practices establish an environment in which outcome-specific efforts enable process quality improvement. A split-group structural equation modeling (SEM) analysis shows that practices focused on specific outcome goals relate to higher quality of patient care in smaller hospitals, whereas a climate focused on specific outcome goals relates to higher quality of patient care in larger hospitals.

In Chapter 3, we test the influence of the adoption of healthcare information technologies (HIT) in relation to the use of data and analysis for organizational planning and error reduction. Secondary data on the levels of HIT adoption as reported by HIMSS and the Dorenfest Institute is combined with primary survey data from 2009 on the use and analysis of data in 272 U.S. hospitals; these data are combined with secondary data on hospital performance of process of care and patient satisfaction as reported by CMS. The results of hierarchical regression analyses indicate that HIT adoption and data use and analysis influence outcomes in different ways: hospitals with higher levels of HIT adoption and error data analysis are associated with higher process of care quality, while hospitals with higher levels of organizational data use are associated with higher patient satisfaction.

Chapter 4 examines the use of information systems for developing higher reliability in care, in relation to the effects of adhering to specified care and of workarounds to limitations in HIT. Secondary data on HIT adoption, CMS process of care, and patient experience of care are combined with primary data from a 2012 survey of hospital quality and nurse directors. The primary survey data measure three key factors: 1) the extent to which care providers use information systems in a patient-focused manner, 2) the extent to which providers adhere to specified care, and 3) the extent to which providers work around limitations in information systems. The results indicate that the contextual factors of hospital size and level of HIT investment play a significant role in understanding the use of healthcare information systems (HIS). For example, the use of HIS with focused attention on the patient is found to have a positive and marginally significant association with process of care in hospitals with more IT adoption, but a negative association with patients’ perceived experience of care in hospitals with less IT adoption.

This research addresses a crossroads of practical and theoretical implications for the future of patient safety culture and the use of healthcare information systems.

Committee:

Kenneth Boyer (Committee Co-Chair); Peter Ward (Committee Co-Chair); John Gray (Committee Member); Sharon Schweikhart (Committee Member)

Subjects:

Business Administration; Health Care; Health Care Management; Information Systems

Keywords:

healthcare; safety culture; information systems; quality

Beam, Michael A. Personalized News: How Filters Shape Online News Reading Behavior
Doctor of Philosophy, The Ohio State University, 2011, Communication

The evolution and diffusion of communication technology has consistently changed interactions between members of the public sphere in forming public opinion. Some democratic scholars have worried that recent developments in personalization technologies will degrade public opinion formation. They worry that personalized news allows citizens to pay attention only to news from their preferred political perspective and may isolate them from challenging perspectives. Empirical research has shown that people with access to more highly selective information technology demonstrate increases in both selectivity and incidental exposure to diverse perspectives.

This dissertation focuses on these behavioral and attitudinal outcomes of using personalized news technologies. Dual-processing theories of information provide the foundation for analyzing opinion formation within the bounded rationality model of public opinion. Personalized news technologies are hypothesized to increase the amount of news exposure and elaboration through increased personal relevance.

Two studies test these broad hypotheses. First, results from a national random sample of adults show that users of personalized web portals are more likely to engage in increased news viewing both online and offline. No differences in preference for perspective-sharing or challenging news sources are found between personalized portal users and non-users. Next, results from an online experiment with Ohio adult Internet users show an increase in time spent reading news articles in personalized news portals compared with a generic portal. An interaction effect on news elaboration is found between customized news portals with source recommendations based on explicit user preferences and increased time spent reading per news article. No differences in news elaboration are found for other personalized news designs, including implicitly recommended news sources based on user profile information and showing users only recommended stories. The implications of these results are discussed in terms of the public opinion debate about new communication technologies, selective exposure research, information processing research, and personalized information system design.

Committee:

Gerald M. Kosicki, PhD (Advisor); David R. Ewoldsen, PhD (Committee Member); R. Kelly Garrett, PhD (Committee Member); Andrew F. Hayes, PhD (Committee Member)

Subjects:

Behavioral Sciences; Communication; Experiments; Information Systems; Information Technology; Journalism; Mass Communications; Political Science

Keywords:

Internet; personalized; personalization; news; public opinion; politics; election; selective exposure; information processing; portal; web; communication; elaboration

Gadapa, Shalini. Assessing SeeIT 3D, A Software Visualization Tool
Master of Computing and Information Systems, Youngstown State University, 2012, Department of Computer Science and Information Systems
Software is inherently complex. This is especially true for large open-source systems. Over the past two decades a number of software visualization tools have been proposed in the literature. The main idea behind creating a software visualization tool is to help a developer or maintainer comprehend the system at different levels of abstraction. Most tools have focused on creating elaborate, attractive visualizations; few have been systematically and empirically validated to make sure they are really useful to a developer. This thesis tries to bridge the gap between a tool and its empirical validation by assessing one such software visualization tool, SeeIT 3D. Sixteen tasks were developed in the context of understanding an open-source system, JFreeChart, written in Java. Ten subjects were recruited and an observational study was performed. The main goal was to determine the effectiveness of SeeIT 3D in typical software tasks when using the visualization within the Eclipse IDE. Results and observations are presented, and will be provided as feedback to the tool developers, who may use them to further improve SeeIT 3D.

Committee:

Bonita Sharif, PhD (Advisor); John Sullins, PhD (Committee Member); Yong Zhang, PhD (Committee Member)

Subjects:

Computer Science; Information Systems; Information Technology

Keywords:

visualization tool; visualizing JFreeChart in SeeIT 3D; SeeIT 3D metaphor; polycylinders (visual type relations)

Vadde, Susheel Reddy. Improving Tissue Elasticity Imaging Using a Kalman Filter-Based Non-Rigid Motion Tracking Algorithm
Master of Computing and Information Systems, Youngstown State University, 2011, Department of Computer Science and Information Systems
Imaging the biomechanical properties of biological tissues under deformation has strong implications for many medical applications such as cancer detection and surgery planning. Quantifying the elasticity of soft tissue using an optical sensor is particularly attractive because it is non-invasive and easy to operate. However, the current computing method is plagued by noise in the two-frame optical flow solutions. In this thesis, a Kalman filter-based tracking algorithm is examined, aiming to improve the quality of cumulative motion estimates over a long video sequence. The proposed method is robust and capable of handling the non-rigid motion that is typical of soft tissue. Experiments using videos of four rat tissue specimens subjected to a biomechanical tensile test indicate that the proposed tracking method is very promising for generating a smooth, accurate, and continuous multi-frame motion field. This type of multi-frame motion data not only allows us to compute a more accurate individual strain elastogram, but also provides valuable information for calibrating a series of relative strain images over the entire deformation process.
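The thesis concerns 2-D non-rigid motion fields, but the predict/update cycle at the heart of any Kalman tracker can be shown in one scalar dimension. The process-noise and measurement-noise values below are arbitrary illustrative choices, not the thesis's tuned parameters:

```python
def kalman_1d(measurements, q=1e-3, r=0.1):
    """Scalar Kalman filter: smooth a noisy sequence of position estimates.

    q: process-noise variance (how much the true state may drift per step)
    r: measurement-noise variance (how noisy each observation is)
    """
    x, p = measurements[0], 1.0          # initial state estimate and covariance
    out = [x]
    for z in measurements[1:]:
        p += q                           # predict: covariance grows by process noise
        k = p / (p + r)                  # Kalman gain
        x += k * (z - x)                 # update: blend in the measurement residual
        p *= (1 - k)                     # covariance shrinks after the update
        out.append(x)
    return out

# Noisy displacement samples around a true value of ~1.0
smoothed = kalman_1d([1.0, 1.2, 0.9, 1.1, 1.0, 1.05])
```

Accumulating such filtered estimates frame by frame, rather than summing raw two-frame optical flow, is what keeps the long-sequence cumulative motion from drifting with noise.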

Committee:

Yong Zhang, PhD (Advisor); John Sullins, PhD (Committee Member); Kriss Schueller, PhD (Committee Member)

Subjects:

Computer Science; Information Systems

Keywords:

Kalman Filter; Elastography; Multi Frame Motion

Walters, Craig M. Application of the human-machine interaction model to Multiple Attribute Task Battery (MATB): Task component interaction and the strategy paradigm
Master of Science (MS), Wright State University, 2012, Biomedical Engineering
The Multiple-Attribute Task Battery (MATB) is composed of four simultaneously running components to which a human operator responds. A prior report quantified information content as a machine input baud rate using the Hick-Hyman and Fitts' laws for three of the four components and defined a strategy function. This report covers methods to quantify the information content of the fourth component, creating a single metric that describes overall task complexity and evaluates human performance and strategy. Six MATB task-scenarios (combinations of two, three, or all four MATB components), each at two input baud rates, are evaluated. Subjects were also provided with a chart showing the information weighting of each MATB component. Results show a change in strategy paradigm between the medium and high input baud rates for the six task-scenarios collectively. This likely occurs because subjects refer to the component weighting chart for strategy formulation only when performing more challenging task-scenarios. The advancements made in this thesis give a better understanding of how humans process information during multitasking, provide a simpler and more effective metric for analyzing MATB human performance, and create a foundation for further model development.
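The baud-rate framing can be sketched with the standard formulations of the two laws named above; the numbers here (4 response alternatives, a 7:1 movement distance-to-width ratio, one event every two seconds) are hypothetical, not values from the report:

```python
import math

def hick_hyman_bits(n):
    """Bits per choice among n equally likely alternatives: H = log2(n)."""
    return math.log2(n)

def fitts_id(distance, width):
    """Fitts' index of difficulty, Shannon form: ID = log2(D/W + 1) bits."""
    return math.log2(distance / width + 1)

def baud_rate(bits_per_event, events_per_second):
    """Task input rate in bits/s, a single metric for overall task demand."""
    return bits_per_event * events_per_second

choice_bits = hick_hyman_bits(4)                 # 4-alternative response: 2 bits
move_bits = fitts_id(7.0, 1.0)                   # log2(8) = 3 bits of movement difficulty
rate = baud_rate(choice_bits + move_bits, 0.5)   # one combined event every 2 s
```

Summing the per-component rates across whichever MATB components are active in a scenario gives a single scenario-level complexity figure of the kind the thesis evaluates.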

Committee:

Chandler Phillips, MD (Advisor); David Reynolds, PhD (Committee Member); Richard A. McKinley, PhD (Committee Member)

Subjects:

Information Systems

Keywords:

HMI; MATB; human performance; human machine interaction; multiple attribute task battery; multi attribute task battery; strategy; strategy paradigm; human; information theory; quantitative informatic model; system complexity metric; task interaction

Makiya, George Kidakwa. A Multi-Level Investigation into the Antecedents of Enterprise Architecture (EA) Assimilation in the U.S. Federal Government: A Longitudinal Mixed Methods Research Study
Doctor of Philosophy, Case Western Reserve University, 2012, Management

This dissertation reports on a multi-dimensional longitudinal investigation of the factors that influence Enterprise Architecture (EA) diffusion and assimilation within the U.S. federal government. The study uses publicly available datasets covering 123 U.S. federal departments and agencies, as well as interview data from CIOs and EA managers within select federal agencies, to conduct three multi-method research studies: 1) a qualitative study investigating the organizational and institutional factors that enhance or impede EA assimilation at the program level; 2) a quantitative study examining the antecedents of EA assimilation at the adopter-unit level; and 3) a longitudinal quantitative study examining a) the antecedents of EA assimilation within adopter populations as marked by prominence within each of the EA assimilation phases, b) the influence of sudden changes in environmental (institutional) context on the EA assimilation process, and c) the determinants of each EA assimilation stage. I use time-lagged partial least squares, ordinary least squares, and multinomial logistic regression to analyze these effects. The study shows that an innovative leadership style is the key to advancing EA program assimilation within adopter units. Framing and labeling an EA program as an administratively driven innovation or reform, as opposed to a business-essential strategic tool, greatly influences its value perception, adoption, and assimilation. Institutional coercive pressure is not a sustainable long-term strategy for driving EA assimilation, though it has a “jolt”-like short-term effect in accelerating assimilation. EA assimilation has distinct micro- and macro-level antecedents. Factors also have "differently-directioned effects": factors that promote EA progress at certain assimilation phases and stages inhibit progress at others. Changes in the temporal environmental context have a “factor elasticity” effect on the explanatory power of the antecedents; that is, antecedents lose and regain their explanatory power commensurate with changes in the environment over time.

Overall, the study’s findings have several major implications for policymakers: 1) complex administrative innovations such as EA require strategic frameworks, as opposed to blueprints, to a) overcome dynamic complexity and b) drive multi-level assimilation; 2) EA assimilation at each level has a different set of definitions and antecedents; 3) each level has different properties and characteristics and requires different approaches and strategies; 4) institutional coercive pressure is effective only when applied as a temporal strategy; and 5) individual EA program and adopter-unit assimilation are interdependent; that is, successful assimilation of EA within the organization is highly dependent upon the degree to which individual EA programs are embraced and legitimized.

Committee:

Kalle Lyytinen, PhD (Committee Chair); Bo Carlson, PhD (Committee Member); Richard Boland, PhD (Committee Member); Jeanne Ross, PhD (Committee Member)

Subjects:

Information Systems

Keywords:

enterprise architecture; diffusion of innovation; assimilation of innovation; coercive pressure; qualitative research; quantitative research

Mo, Dengyao. Robust and Efficient Feature Selection for High-Dimensional Datasets
PhD, University of Cincinnati, 2011, Engineering and Applied Science: Mechanical Engineering
Feature selection is an active research topic in the machine learning and knowledge discovery in databases (KDD) communities. It helps make a data mining model more comprehensible to domain experts, improves the prediction performance and robustness of the model, and reduces model training time. This dissertation aims to provide solutions to three issues that are overlooked by many current feature selection researchers: feature interaction, data imbalance, and multiple subsets of features. Most extant filter feature selection methods are pair-wise comparison methods that test each pair of variables, i.e., one predictor variable and the response variable, and provide a correlation measure for each feature with respect to the response variable. Such methods cannot take feature interactions into account. Data imbalance is another issue in feature selection: without considering it, the features selected will be biased towards the majority class. In high-dimensional datasets with sparse data samples, there will be many different feature sets that are highly correlated with the output. Domain experts usually expect us to identify multiple feature sets for them so that they can evaluate the sets based on their domain knowledge. This dissertation addresses these three issues using a criterion called minimum expected cost of misclassification (MECM). MECM is a model-independent evaluation measure that evaluates the classification power of the tested feature subset as a whole, and it has adjustable weights to deal with imbalanced datasets. A number of case studies showed that MECM has favorable properties for finding a compact subset of interacting features. In addition, an algorithm and a corresponding data structure were developed to produce multiple feature subsets. The success of this research will have broad applications ranging from engineering and business to bioinformatics, such as credit card fraud detection, email filter settings for spam classification, and gene selection for disease diagnosis.
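The MECM criterion itself is defined in the dissertation; as a hedged sketch of the general idea, an expected-cost measure over a confusion matrix with adjustable error costs looks like the following (the counts and cost weights are hypothetical):

```python
def expected_cost(conf, cost_fp, cost_fn):
    """Expected per-sample cost of misclassification for a feature subset.

    conf: confusion counts {"tp", "fn", "fp", "tn"} from evaluating the subset.
    Asymmetric costs let the criterion weight minority-class errors more heavily.
    """
    n = sum(conf.values())
    return (conf["fp"] * cost_fp + conf["fn"] * cost_fn) / n

# Hypothetical confusion counts for one candidate subset on an imbalanced dataset
conf = {"tp": 40, "fn": 10, "fp": 20, "tn": 930}
balanced = expected_cost(conf, cost_fp=1.0, cost_fn=1.0)
weighted = expected_cost(conf, cost_fp=1.0, cost_fn=10.0)  # penalize missed minority class
```

Because the score is computed from the subset's predictions as a whole rather than from per-feature correlations, interacting features that are individually weak can still be credited jointly, which is the property the abstract emphasizes.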

Committee:

Hongdao Huang, PhD (Committee Chair); Sundararaman Anand, PhD (Committee Member); Jaroslaw Meller, PhD (Committee Member); David Thompson, PhD (Committee Member); Michael Wagner, PhD (Committee Member)

Subjects:

Information Systems

Keywords:

Feature Selection;Data Mining;Machine Learning;Statistical Modeling;Knowledge Discovery in Database

Ghosh, Suvankar. Essays on Emerging Practitioner-Relevant Theories and Methods for the Valuation of Technology
PHD, Kent State University, 2009, College of Business Administration / Department of Management and Information Systems

This dissertation comprises three essays on emerging practitioner-relevant theories and methods, such as Real Options (RO) and Economic Value Added (EVA), for the valuation of investments in technology. The first essay develops an innovative approach to assessing the practitioner relevance of academic research, based on determining Granger causality between academic and practitioner interest in a given topic, as proxied by publication activity on that topic. The academic and practitioner interests are modeled as a two-component vector autoregressive (VAR) process; in addition to gauging Granger causality, which is done on stabilized components of the VAR model, I also use cointegration to evaluate the equilibrium relationship between the components of the VAR regardless of their stationarity. This model is tested on the two topics of EVA and RO.

The second essay develops an alternative to the Technology Acceptance Model (TAM) called the Methodology Adoption Decision Model (MADM) for the adoption of new methodology by a firm. Analogous to the TAM, the MADM is a parsimonious model which views the theoretical soundness and the practical applicability of a methodology as the key drivers of firm-level adoption of methodology. The theoretical soundness and practical applicability are proxied by the sentiments expressed in the academic and practitioner literatures on the methodology in question. The MADM is used to assess the comparative likelihood of adoption of EVA and RO based on a sentiment extraction experiment for determining the inclinations of the academic and practitioner communities towards EVA and RO.

The third essay applies RO to the context of investments by firms in XML-based enterprise integration (EI) technology. An interpretive hermeneutic approach is employed to develop a set of decision-making heuristics for the exercise of real options that optimize the RO value construct of Strategic Net Present Value (SNPV). This decision-making framework is characterized by decision-context uncertainty and firm-level capability with XML-based EI technology. The heuristics provide managerial prescription on preferred strategies for exercising real options such as whether the firm should deploy an Enterprise Services Bus (ESB), versus an EAI Suite, under given conditions of uncertainty and firm capability for building the enterprise integration infrastructure of the firm.

Committee:

Marvin Troutt, PhD (Committee Chair); Alan Brandyberry, PhD (Committee Member); Felix Offodile, PhD (Committee Member); John Thornton, PhD (Committee Member)

Subjects:

Finance; Information Systems; Operations Research

Keywords:

Real Options; Economic Value Added; EVA; Enterprise Integration; XML; Time Series Analysis; Granger Causality; Cointegration; Multivariate Analysis of Variance

Ponziani, Kevin R. Control System Design and Optimization for the Fuel Cell Powered Buckeye Bullet 2 Land Speed Vehicle
Master of Science, The Ohio State University, 2008, Electrical and Computer Engineering
The Buckeye Bullet 2 is a hydrogen fuel cell electric land speed vehicle, designed to run on the Bonneville Salt Flats in Utah. To operate the vehicle, a number of control systems are utilized and coordinated through a high-level supervisory controller. By integrating a complex network of sensors and inputs from the driver, the controller is able to safely and effectively maximize the power at the wheel to increase the vehicle's top speed. A power management scheme using rule-based control manages the power between a motor controller and a fuel cell controller. Additionally, control is developed to manage the thermal energy generated by the fuel cell reaction and maintain a constant operating temperature. Various safety systems are integrated into the central system as well. Finally, methods are presented to process data following a speed trial and to support diagnosis, improvement, and optimization of the vehicle systems.
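The flavor of rule-based supervisory power management can be sketched as a couple of prioritized rules; all thresholds, limits, and numbers below are invented for illustration and are not the actual BB2 control law:

```python
def power_command(request_kw, stack_limit_kw, stack_temp_c, temp_ceiling_c=80.0):
    """Rule-based supervisory logic: honor the driver's power request,
    but derate the fuel cell stack near its thermal ceiling."""
    if stack_temp_c >= temp_ceiling_c:       # Rule 1: thermal derating
        stack_limit_kw *= 0.5
    return min(request_kw, stack_limit_kw)   # Rule 2: clamp to the stack limit

cool = power_command(500.0, 400.0, 60.0)     # stack cool: limited only by the stack
hot = power_command(500.0, 400.0, 90.0)      # stack hot: derated output
```

A real supervisory controller layers many more such rules (safety interlocks, driver inputs, motor controller limits) on top of this basic request-then-clamp pattern.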

Committee:

Giorgio Rizzoni, PhD (Advisor); Steve Yurkovich, PhD (Committee Member)

Subjects:

Electrical Engineering; Energy; Engineering; Information Systems

Keywords:

Land Speed Racing; Control Systems; Hydrogen Power; Fuel Cells; Electric Racing

Haley, Jason S. Climatology of Freeze-Thaw Days in the Conterminous United States: 1982-2009
MA, Kent State University, 2011, College of Arts and Sciences / Department of Geography
Freeze-thaw cycles affect almost every part of the conterminous United States on an annual basis, causing, among other effects, damage to roads and homes. This study examines the climatology of freeze-thaw cycles in this area from 1982-2009 using GIS and statistical analysis. It includes an overview of annual and monthly freeze-thaw patterns, changes in annual and seasonal freeze-thaw, and an urban/rural comparison of freeze-thaw in the conterminous United States.
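The day-level definition underlying such a climatology can be sketched directly. This assumes the common definition of a freeze-thaw day (daily maximum above 0 °C and daily minimum below 0 °C), which may differ from the exact thresholds used in the thesis:

```python
def freeze_thaw_days(daily_temps):
    """Count days whose maximum rises above freezing while the minimum stays below.

    daily_temps: iterable of (tmax_celsius, tmin_celsius) pairs.
    """
    return sum(1 for tmax, tmin in daily_temps if tmax > 0.0 and tmin < 0.0)

# One hypothetical week of station data: days 1, 4, and 5 cross the freezing point
week = [(5.0, -3.0), (2.0, 1.0), (-1.0, -6.0), (4.0, -0.5), (0.5, -0.5)]
count = freeze_thaw_days(week)
```

Aggregating such counts per station and interpolating across stations is what produces the gridded annual and monthly maps a GIS-based climatology works from.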

Committee:

Scott Sheridan, Dr. (Advisor); Thomas Schmidlin, Dr. (Committee Member); Emariana Taylor, Dr. (Committee Member)

Subjects:

Climate Change; Information Systems

Keywords:

Freeze Thaw; Climate; GIS

Pschorr, Joshua Kenneth. SemSOS: An Architecture for Query, Insertion, and Discovery for Semantic Sensor Networks
Master of Science (MS), Wright State University, 2013, Computer Science
With sensors, storage, and bandwidth becoming ever cheaper, there has been a recent drive to make sensor data accessible on the Web. However, because of the vast number of sensors collecting data about our environment, finding relevant sensors on the Web and then interpreting their observations is a non-trivial challenge. The Open Geospatial Consortium (OGC) defines a web service specification known as the Sensor Observation Service (SOS) that is designed to standardize the way sensors and sensor data are discovered and accessed on the Web. Though this standard goes a long way toward providing interoperability between sensor data producers and consumers, it is predicated on the idea that the consuming application is equipped to handle raw sensor data. Sensor data consumers are generally interested not just in the raw data itself, but in actionable information about their environment. The approaches to dealing with this are either to make each individual consuming application smarter or to make the data served to them smarter. This thesis presents an application of the latter approach, accomplished by providing a more meaningful representation of sensor data by leveraging semantic web technologies. Specifically, this thesis describes an approach to sensor data modeling, reasoning, discovery, and query over richer semantic data derived from raw sensor descriptions and observations. The artifacts resulting from this research include an implementation of an SOS service that hews to both Sensor Web and Semantic Web standards in order to bridge the gap between syntactic and semantic sensor data consumers, and that has been proven by use in a number of research applications storing large amounts of data; this implementation serves as an example of an approach for designing applications that integrate syntactic services over semantic models and allow for interactions with external reasoning systems.
As more sensors and observations move online and the Internet of Things becomes a reality, the integration of sensor data into our everyday lives will become important for all of us. The research represented by this thesis explores this problem space and presents an approach to dealing with many of these issues. Going forward, this research may prove a useful elucidation of the design considerations and affordances that can allow low-level sensor and observation data to become the basis for machine-processable knowledge of our environment.
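The "make the data smarter" approach described above amounts to lifting raw observations into an ontology-backed representation. A minimal sketch follows; the prefixes, property names, and sensor identifiers are illustrative placeholders, not SemSOS's actual ontology:

```python
def observation_to_triples(obs):
    """Lift a raw SOS-style observation into simple RDF-like triples."""
    s = obs["sensor_id"]
    return [
        (s, "rdf:type", "ssn:Sensor"),                       # what the thing is
        (s, "ssn:observes", obs["property"]),                # what it measures
        (s, "om:hasResult", f'{obs["value"]} {obs["unit"]}'),  # the observation itself
    ]

triples = observation_to_triples(
    {"sensor_id": "ex:thermometer1", "property": "ex:AirTemperature",
     "value": 21.5, "unit": "Cel"}
)
```

Once observations are expressed as triples in a shared vocabulary, a reasoner or SPARQL endpoint can answer discovery queries ("all sensors observing air temperature near X") that raw SOS responses cannot support directly.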

Committee:

Krishnaprasad Thirunarayan, Ph.D. (Advisor); Amit Sheth, Ph.D. (Committee Member); Bin Wang, Ph.D. (Committee Member)

Subjects:

Computer Science; Geographic Information Science; Information Systems; Remote Sensing; Systems Design; Web Studies

Keywords:

Semantic Web; Sensor Web; Linked Data; Semantic Sensor Web; Sensor Data; Sensor Web Enablement; Sensor Observation Service
