Institute of Information Science
Network System and Service Laboratory
Principal Investigators:
Jan-Ming Ho, Ling-Jyh Chen, Meng-Chang Chen, Sheng-Wei Chen, Wen-Tsuen Chen, Tyng-Ruey Chuang, Jane W. S. Liu

Our research addresses several aspects of network systems and services, including the design of high-quality, energy-efficient, and secure environments for wireless sensing and networking; improvements to delay-tolerant network protocols; leveraging human computation to address key challenges; development of critically needed information and communication technologies for disaster management; and solutions to the network computation problems that arise in providing large-scale financial risk management services.

Wen-Tsuen Chen
Mobile video streaming has become a dominant service, so traffic scheduling has become an important issue for ensuring fairness and providing better quality of service (QoS). Existing QoS schemes are not designed for video traffic and do not consider the distinct features of individual videos, which inevitably degrades the user experience. We have therefore used software-defined networking (SDN) to build an architecture that reserves bandwidth and routes video traffic based on available network resources and video features such as bit rate. In this way, we can guarantee video quality while also improving network utilization.
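A minimal sketch of this bit-rate-aware routing and reservation idea, assuming a hypothetical controller view of residual link capacities (the function name and data structure are illustrative, not the system's actual API):

```python
from collections import deque

def reserve_video_path(links, src, dst, bitrate_mbps):
    """Find a path whose every link has enough residual bandwidth
    for the video's bit rate, then reserve that bandwidth.

    links: dict mapping (u, v) -> residual capacity in Mbps.
    Returns the path as a list of nodes, or None if no path fits.
    """
    # Keep only links that can carry the stream.
    adj = {}
    for (u, v), cap in links.items():
        if cap >= bitrate_mbps:
            adj.setdefault(u, []).append(v)

    # Breadth-first search finds a shortest feasible path.
    parent = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            break
        for nxt in adj.get(node, []):
            if nxt not in parent:
                parent[nxt] = node
                queue.append(nxt)
    if dst not in parent:
        return None  # no path can sustain this bit rate

    # Reconstruct the path and reserve bandwidth on each hop.
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = parent[node]
    path.reverse()
    for u, v in zip(path, path[1:]):
        links[(u, v)] -= bitrate_mbps
    return path
```

In an SDN setting, the reservation step would be realized by installing flow rules and rate limits on each switch along the chosen path.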

Transmission mechanisms in traditional personal communication service (PCS) networks and in next-generation broadband all-IP networks are different, so the mobility models used in traditional PCS networks do not apply to 4G. To capture the essence of 4G network behavior, we propose a new mobility model from which we derive analytical results for telecommunications traffic. We observe that these analytical results bring significant insight into the deployment of network infrastructure. We also present a long-range wake-up radio (LRWUR), and propose an overall metric for evaluating the Internet of Things (IoT) and wireless sensor networks (WSNs). Experimental results show that block orthogonal codes improve both the communication range and the packet error rate, thereby enhancing wireless sensor node performance and reducing maintenance costs for IoT and WSN applications. Regarding security and privacy in intelligent sensing and networking applications, we are addressing the challenges of access control in both WSNs and online social networks (OSNs). The main challenges are not only the unique security and privacy problems of wireless networks, but also the computation and communication efficiency required by energy-constrained devices such as wireless sensor nodes and smartphones. We aim to design stronger secure sensing and networking systems that do not exhaust the limited resources of mobile devices.
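As a rough illustration of why orthogonal codes help a low-power receiver, the sketch below builds Sylvester–Hadamard codewords and decodes by maximum correlation; this is a generic textbook construction, not the laboratory's actual LRWUR design:

```python
import random

def hadamard_codes(order):
    """Generate 2**order mutually orthogonal +/-1 codewords
    via the Sylvester-Hadamard construction."""
    h = [[1]]
    for _ in range(order):
        h = [row + row for row in h] + [row + [-x for x in row] for row in h]
    return h

def decode(received, codes):
    """Pick the codeword with the largest correlation to the received
    chip sequence -- orthogonality keeps correlations with the wrong
    codewords near zero even under heavy noise, which is what extends
    the usable communication range."""
    scores = [sum(r * c for r, c in zip(received, code)) for code in codes]
    return scores.index(max(scores))

# Transmit symbol 5 over a noisy channel (Gaussian noise stand-in);
# with moderate noise the correct index is typically recovered.
codes = hadamard_codes(3)          # 8 codewords of length 8
random.seed(0)
rx = [c + random.gauss(0, 1.0) for c in codes[5]]
decoded = decode(rx, codes)
```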

Ling-Jyh Chen
We study networked sensing systems, with an emphasis on both energy efficiency and large-scale sensor data management. We have developed an adaptive GPS scheduling algorithm to prolong the lifetime of GPS-enabled mobile sensors, and have designed a hybrid location-sensing approach that balances power consumption against information accuracy by combining multiple sensors with different energy profiles and data granularities. We have also proposed a lightweight, lossless compression algorithm for spatio-temporal data, along with a set of query algorithms that operate directly on the compressed data. The results of this research have been implemented in two real-world networked sensing systems: YushanNet, a mission-critical sensor network for hiker tracking, search, and rescue in Yushan National Park; and TPECMS, a participatory sensing system for measuring the comfort of public transportation in the Taipei metropolitan area.
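A minimal sketch of the lossless-compression idea for spatio-temporal traces, assuming delta encoding with zigzag/varint packing (a common generic technique, not necessarily the algorithm our system uses): consecutive GPS fixes differ little, so the deltas are small integers that pack into few bytes, and nothing is lost.

```python
def zigzag(n):
    # Map signed ints to unsigned so small magnitudes stay small.
    return (n << 1) ^ (n >> 63)

def unzigzag(u):
    return (u >> 1) ^ -(u & 1)

def varint(u):
    # Encode an unsigned int in 7-bit groups, high bit = "more bytes".
    out = bytearray()
    while True:
        byte = u & 0x7F
        u >>= 7
        if u:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def compress(points):
    """Delta-encode (timestamp, lat_e6, lon_e6) fixes."""
    out, prev = bytearray(), (0, 0, 0)
    for p in points:
        for cur, pre in zip(p, prev):
            out += varint(zigzag(cur - pre))
        prev = p
    return bytes(out)

def decompress(data):
    """Exact inverse of compress: varint-decode, then undo the deltas."""
    vals, u, shift = [], 0, 0
    for b in data:
        u |= (b & 0x7F) << shift
        shift += 7
        if not b & 0x80:
            vals.append(unzigzag(u))
            u, shift = 0, 0
    out, prev = [], (0, 0, 0)
    for i in range(0, len(vals), 3):
        cur = tuple(v + p for v, p in zip(vals[i:i + 3], prev))
        out.append(cur)
        prev = cur
    return out
```

Because each record decodes from a fixed number of varints, range queries over time can skip through the compressed stream without reconstructing the full trace.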

Sheng-Wei Chen
Applications of human computation range from exploiting unsolicited user contributions, such as using tags to aid understanding of the visual content of yet-unseen images, to using crowdsourcing platforms and marketplaces (e.g., Amazon's Mechanical Turk) to micro-outsource tasks such as semantic video annotation to a large population of workers. Crowdsourcing also offers a time- and resource-efficient way to collect large inputs for system design and evaluation. We are applying crowdsourcing to optimize computer systems more rapidly and to address human factors more effectively. In recent years, we have performed extensive studies of the performance of Games with A Purpose (GWAP) systems and designed a human computation game that efficiently collects diverse user annotations. In addition, we have proposed a cheat-proof framework for assessing the quality of experience (QoE) provided by multimedia content. We have found that crowdsourcing is a powerful strategy for harnessing collective intelligence on AI-hard problems, and we will continue to study how to use it effectively to overcome challenges in a variety of areas.
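One common way to aggregate crowdsourced paired-comparison votes while screening out careless or dishonest raters is majority agreement. The sketch below is a simplified stand-in for the cheat-proof QoE framework mentioned above; the function name, data format, and 50% threshold are all illustrative assumptions:

```python
from collections import defaultdict

def qoe_ranking(votes):
    """Aggregate pairwise 'A looks better than B' votes into per-stimulus
    win counts, flagging raters who disagree with the per-pair majority
    more than half the time.

    votes: iterable of (rater_id, stimulus_a, stimulus_b, picked_stimulus).
    Returns (wins, flagged_raters).
    """
    # Tally votes per unordered pair.
    tally = defaultdict(lambda: [0, 0])   # pair -> [votes for pair[0], pair[1]]
    for rater, a, b, pick in votes:
        key = (min(a, b), max(a, b))
        tally[key][0 if pick == key[0] else 1] += 1

    # Majority outcome per pair.
    majority = {k: (k[0] if v[0] >= v[1] else k[1]) for k, v in tally.items()}

    # Flag raters whose agreement with the majority falls below 50%.
    agree, total = defaultdict(int), defaultdict(int)
    for rater, a, b, pick in votes:
        total[rater] += 1
        if pick == majority[(min(a, b), max(a, b))]:
            agree[rater] += 1
    flagged = {r for r in total if agree[r] / total[r] < 0.5}

    # Score stimuli by wins among trusted raters only.
    wins = defaultdict(int)
    for rater, a, b, pick in votes:
        if rater not in flagged:
            wins[pick] += 1
    return dict(wins), flagged
```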

Jane W. S. Liu
A disaster management information system (DMIS) facilitates the access, use, and presentation of data and information by application systems and services that support decisions and operations during all phases of disaster management. State-of-the-art DMIS share several limitations: they cannot make effective use of all available information sources during emergencies; they do not exploit synergistic information from networks of things and crowds of people; and they are not sufficiently agile in responding to changes in a disaster situation. We are collaborating with researchers in the Institute of Earth Sciences and with engineering faculty from several leading universities in Taiwan and the USA to develop an open framework for building DMIS free of these limitations. Our work is supported by an Academia Sinica Sustainability Science Research project called OpenISDM (Open Information Systems for Disaster Management). Current projects in our laboratory include: smart cyber-physical devices and applications as elements of a disaster-prepared smart environment; algorithms and tools for collecting and fusing human sensor data contributed by crowds of people with surveillance data from in-situ physical sensors; methods and tools for communication and computation infrastructures that gather, cache, fuse, and distribute ubiquitous, heterogeneous real-time streams of sensor data to response centers, individual responders, and volunteers during disasters; exploitation of the complementary merits of different network access technologies and network types to keep physical connectivity as robust as possible during and after disasters; and a combined named-data networking and software-defined networking framework for enhancing the disaster resilience of communication infrastructures.
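As a small illustration of fusing crowd reports with physical sensor readings of the same quantity, the sketch below uses inverse-variance weighting, a generic estimator rather than the project's actual fusion algorithm; the example values are hypothetical:

```python
def fuse(readings):
    """Inverse-variance weighted fusion of heterogeneous reports of one
    quantity (e.g., flood depth at a location). Noisier sources -- such
    as rough crowd reports -- contribute proportionally less weight.

    readings: iterable of (value, variance) pairs.
    Returns the fused estimate and its (smaller) variance.
    """
    num = sum(value / var for value, var in readings)
    den = sum(1.0 / var for _, var in readings)
    return num / den, 1.0 / den

# A precise in-situ gauge plus two rough crowd reports (illustrative).
sources = [(0.52, 0.01), (0.8, 0.25), (0.3, 0.25)]
estimate, variance = fuse(sources)
```

Note that the fused variance is always below that of the best single source, which is the formal sense in which human and physical sensors are synergistic.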

Jan-Ming Ho
We are interested in network computation problems that arise when providing large-scale risk management services. Despite the long history of economic and financial theories and practices in financial risk management, the worldwide credit crisis of 2008 demonstrated the vulnerability of the financial industry to risk. Several examples show that even the three major rating agencies were unable to report major default events in a timely manner. In Enron's 2001 bankruptcy, its bonds maintained "investment grade" ratings until five days before the company declared bankruptcy; in Lehman Brothers' 2008 case, the firm still held "investment grade" ratings on the morning it declared bankruptcy. The rating agencies claim that their reports provide a long-term perspective rather than an up-to-the-minute assessment. Rating a company's credit involves hundreds of firm-specific and macroeconomic variables, so assessing credit risk in real time is undoubtedly a task of high computational complexity; nevertheless, it is an important foundation for maintaining the stability of financial markets. Complementary to research on economic and financial theories and practices in risk management, we aim to develop computing technologies for large-scale, real-time financial risk management services, including (1) real-time rating of corporate credit; (2) real-time rating of personal credit; and (3) pricing of financial products.
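To make the shape of the computation concrete, the sketch below scores a one-period default probability with a logistic model over firm-specific variables and maps it to a coarse rating band. All features, coefficients, and thresholds are hypothetical; a production model would be fit on historical default data and re-estimated continuously as market data streams in.

```python
import math

def default_probability(features, weights, bias):
    """Logistic model mapping firm-specific and macroeconomic variables
    (assumed already standardized) to a default probability."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def rating(pd):
    """Map default probability to a rating band. Thresholds are
    illustrative, not any agency's actual scale."""
    for threshold, grade in [(0.001, "AAA"), (0.01, "A"),
                             (0.05, "BBB"), (0.20, "BB")]:
        if pd < threshold:
            return grade
    return "C"

# leverage, volatility, interest coverage (hypothetical values)
features = [1.8, 2.1, -0.9]
weights = [0.9, 1.2, -0.7]
pd = default_probability(features, weights, bias=-4.0)
grade = rating(pd)
```

The real-time difficulty is scale, not the single evaluation: re-scoring every rated firm against hundreds of variables whenever the inputs move is what drives the computational cost.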
