Wednesday 5 December 2018

Green Public Computer Lab using Single-Board Computer and Interactive Computer System

The new computer reservation system is designed for free public computer labs in educational institutes. The system shuts down idle computers in the lab (computers without a reservation), which can significantly lower the electricity bill. Students can sign in to the lab faster and share computer resources fairly with other students through an interactive kiosk with a touch screen. A low-power interactive kiosk is built from a Raspberry Pi and a touch-screen monitor.

1. Raspberry Pi
The Raspberry Pi is a credit-card-sized single-board computer with a built-in graphics processor and network interface. It draws only about 2 W of electricity, yet its computing performance is quite efficient. A Raspberry Pi 2 Model B with a 900 MHz quad-core ARM Cortex-A7 CPU was used as the interactive kiosk. The kiosk uses a touch-screen monitor and is connected to a barcode scanner for user authentication.

2. Interactive kiosk
Interactive kiosks provide useful information and services to users. Integrating a touch-screen monitor makes an interactive kiosk more useful and more engaging.
Prior work has used interactive kiosks in museums to provide 3-D displays of items to visitors, and other work has developed an interactive map controlled with pointing gestures. Pidora Linux was installed as the operating system on a Class 10 SD card in the Raspberry Pi 2. The Raspberry Pi 2 runs a web browser that loads the computer reservation system from a web server hosted on a virtual machine.

3. Design of new system
To integrate and control every computer in the lab, a lightweight program was developed that runs as a Windows service. The program was developed using Microsoft Visual Studio Express 2012. First, the program initializes itself automatically after the client finishes booting (delayed start). Next, it connects to the web server through a REST API and queries whether the computer is reserved. If there is a reservation, it keeps polling the web server every N seconds (N = 30 in this implementation). Otherwise, it shuts the computer down through the Windows API. Each reservation lasts 2 hours.
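The polling loop described above can be sketched in Python. The original service was a Windows program built with Visual Studio; here the endpoint URL, client name, and JSON response shape are illustrative assumptions, not details from the paper:

```python
import json
import subprocess
import time
import urllib.request

SERVER = "http://reservation.example.edu/api"  # hypothetical endpoint
CLIENT_ID = "lab-pc-17"                        # hypothetical machine name
POLL_SECONDS = 30                              # N = 30, as in the summary

def is_reserved(fetch=urllib.request.urlopen):
    """Ask the reservation server whether this client is reserved."""
    with fetch(f"{SERVER}/reservations/{CLIENT_ID}") as resp:
        return json.load(resp).get("reserved", False)

def next_action(reserved):
    """The service's core decision: keep polling while reserved,
    otherwise shut the idle machine down."""
    return "poll" if reserved else "shutdown"

def run():
    while next_action(is_reserved()) == "poll":
        time.sleep(POLL_SECONDS)
    # No reservation: power the machine off (Windows shutdown command;
    # the real service calls the Windows API instead).
    subprocess.run(["shutdown", "/s", "/t", "0"])
```

Separating the decision (`next_action`) from the network and OS calls keeps the shutdown logic testable without a live server.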

Reference :

Dolwithayakul, B., Boonnasa, P., Klomchit, S., & Tuwachaosuan, S. (2016). Green public computer lab using single-board computer and interactive computer reservation system. ICSEC 2015 - 19th International Computer Science and Engineering Conference: Hybrid Cloud Computing: A New Approach for Big Data Era, 1–4. https://doi.org/10.1109/ICSEC.2015.7401420

New Development in Computer Software

1. Knowledge-based expert system
A knowledge-based expert system is an intelligent program that uses knowledge and inference procedures to solve problems that would be difficult to solve without human expertise.

2. Human information processor
One popular model of the human information processor breaks it down into three major subsystems: a perceptual subsystem, a cognitive subsystem, and a motor subsystem. The perceptual subsystem accepts inputs from the external world through its sensors; the cognitive subsystem interprets the inputs and triggers the motor subsystem to react to them.
Stimuli from the external world are the inputs to the human information processor. These stimuli are picked up by sensors such as the eyes, ears, skin and nose. The perceptual subsystem consists of these sensors and their associated buffer memories; images collected by the sensors are stored temporarily in these buffer memories before they are passed on to the cognitive subsystem for processing.
The cognitive subsystem consists of a working (short-term) memory, a long-term memory and a cognitive processor. Shortly after sensory information is put into the sensory buffers by the sensors of the perceptual subsystem, the cognitive processor symbolically encodes the input and puts it into working memory. The cognitive processor is like the central processing unit of a computer: it cycles periodically, recognizing that there is input in the sensory buffers that it needs to pick up and place in working memory.
The motor subsystem is the final piece of the jigsaw puzzle that is the human information-processing system. Thought is translated into action by motor processors, which activate voluntary muscles, which in turn produce some observable activity.

3. Expert systems
The basic structure of an expert system contains a knowledge base similar to the long-term memory of an expert, and it contains an inference mechanism that will pick up the appropriate information from the knowledge base like the cognitive processor in the human information processor. It has a working memory which holds the results of the last thought and the input that will trigger the next thought.
The knowledge base is made up of rules and facts much like the long-term memory of an expert. Representation of knowledge usually takes the form of rules operating on facts about objects of interest in the domain over which the expert system professes expertise.  
The second component of an expert system is the inference engine, which operates on the knowledge base in its search for a solution. The inference engine has two parts: the inference part and the control part. The inference engine handles incomplete information by allowing rules to fail when the information necessary to evaluate their premises is not available.
The control part addresses two primary problems. First, an expert system must have a way to decide where to start: rules and facts reside in the knowledge base, and there must be a way for the reasoning process to begin. Second, the inference engine must resolve conflicts that occur when alternative lines of reasoning emerge. It could happen, for example, that the system reaches a point at which two or three rules test true; the inference engine must choose which rule to examine next.
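The structure just described (a knowledge base of rules and facts, a working memory, and an inference engine with a conflict-resolution strategy) can be sketched as a tiny forward-chaining loop. The facts and rules below are invented for illustration; conflict resolution here is simply rule order:

```python
# Working memory starts with the known facts (hypothetical example).
facts = {"fever", "cough"}

# Knowledge base: each rule is (set of premises, conclusion).
rules = [
    ({"fever", "cough"}, "flu_suspected"),
    ({"flu_suspected"}, "recommend_rest"),
]

def infer(facts, rules):
    """Repeatedly fire any rule whose premises are all in working memory,
    adding its conclusion, until no rule produces a new fact. Rules whose
    premises are not all available simply fail to fire (incomplete
    information), and rule order acts as the conflict-resolution strategy."""
    memory = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= memory and conclusion not in memory:
                memory.add(conclusion)
                changed = True
    return memory

print(sorted(infer(facts, rules)))
# ['cough', 'fever', 'flu_suspected', 'recommend_rest']
```

Real expert-system shells add far richer control strategies (specificity ordering, recency, agendas), but the fixed-point loop above is the core of forward chaining.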


Reference : 
Authors, F. (2005). New developments in computer software.

Managing and Controlling Computer Misuse

1. Understanding the phenomena of “insecurity”
The UK Audit Commission has defined computer fraud as any fraudulent behaviour connected with computerisation by which someone intends to gain dishonest advantage.
Computer fraud falls into three broad categories: input, throughput and output fraud. Input frauds are the easiest to commit and are carried out by entering false or manipulated information into computer systems. In most cases input frauds are committed by insiders who have access to the systems and who probably have responsibility for inputting certain kinds of data. The second category is throughput fraud. Throughput frauds are the most lethal of the computer-related crimes; various kinds of throughput fraud have resulted in heavy losses for companies or have even led to the demise of major institutions. The third category is output fraud. Output frauds are relatively unsophisticated compared with the other kinds. Usually they occur in conjunction with input frauds (when they are conducted to conceal bogus input); in other cases the output of the computer systems is misused in various ways. As with input frauds, the majority of output frauds are also carried out by internal employees.

2. Managing and controlling computer misuse
Managing and controlling computer misuse requires the implementation of a broad range of interventions. These can be classified into three categories: technical, formal and informal. Typically, an organization can implement controls to limit access to buildings, rooms or computer systems (technical interventions). Commensurate with this, the organizational hierarchy could be expanded or shortened (formal interventions) and an education, training and awareness programme put in place (informal interventions).

  • Technical interventions
Implementation of technical controls is often conceptualized in a rather narrow and mechanistic manner. Although notable advances have been made in identifying risks and establishing relevant countermeasures, implementation has largely taken the form of simple access control mechanisms.

  • Formal interventions
Formal interventions pertain to reorienting information and security practices around a reorganized structure. If an organization has created new processes for conducting its business, then adequate emphasis needs to be placed on developing controls and rules that reflect the emergent structure.

  • Informal interventions
Informal interventions centre on increasing awareness of security issues. Increased awareness should be supplemented with an ongoing education and training programme.


Reference : 
Dhillon, G. (1999). Managing and controlling computer misuse. Information Management & Computer Security, 7(4), 171–175. https://doi.org/10.1108/09685229910292664



Computer Literacy and Libraries


Computer literacy has been defined in several ways: whatever understanding, skills, and attitudes one needs to function effectively within a given social role that directly or indirectly involves computers; whatever a person needs to be able to do with, and know about, computers in order to function in an information-based society; and that compendium of knowledge and skills which ordinary educated people need about computers in order to function effectively at work and in their private lives.

Online Public Access Catalogues (OPACs) enable library users to retrieve information themselves with minimal help from library personnel. Enormous online databases were made available by several vendors, such as Dialog, SDC, Wilsonline, BRS, LEXIS, Data-Star and ESA-IRS, and these could be searched by anyone who possessed a terminal and a modem.

Problem concerning computerisation in libraries
First, computer fear, anxiety and resistance. These include fears concerning the computer itself, that is, jargon, technological trends, the 'paperless society'; worries about damaging the computer or databases, or making mistakes in general; not being able to learn how to use the computer; and showing an inability to type. Added to these are fears about computers affecting jobs: job security; downgrading of jobs; fewer career prospects; lack of job interest; greater job isolation and fewer human contacts; fear that the computer will act as 'big brother' and supervise more closely; and worsening relationships with users.
Next, a clash of methodologies. This concerns the great variety of information retrieval methodologies required to access: online databases offered by a particular vendor; identical online databases offered by different vendors; on-disc databases published by vendors, which differ from the same databases made available online and hence require different search methodologies; and the various 'library packages' on which OPACs and in-house databases are created, the majority of which differ widely in retrieval methods, retrieval terminology, and so on.
Other than that, there are miscellaneous problems. There are times when top management will introduce new software to librarians or information officers to solve some library problem even though it is not suitable for the task. There are also online vendors, CD-ROM publishers and library package marketers that rarely live up to their promises. Combined with impenetrable error messages, which are not standardised either, and with manuals obviously not written for those unfamiliar with the most abstruse computer jargon, it is not surprising that library personnel, even when computer literate, feel frustrated. Formal training programmes for software and hardware are not frequently available, and where they are available they are often very expensive. There are still institutions and companies that insist on computerising their libraries on their large mainframes, yet will not buy the appropriate software packages, instead constraining their computer departments to develop their own software 'in consultation with the library staff'. Unfortunately, this frequently leads to trouble: librarians often have too little knowledge of what the computer is capable of, and of computer jargon, to make their requirements understood by the computer staff.

Solutions to overcome computer anxiety in library staff
First, through communication. Long before any automation project is begun, staff must be informed, indeed, should be consulted during the planning process. Any computerisation planned by management in conjunction with staff will be welcomed and eagerly anticipated by staff. Full information concerning how computerisation will affect staff with regard to relieving their more boring routine tasks, their involvement in computerisation, the proposed methods of training, even job descriptions of those posts involved more intimately and directly with computerisation, must be furnished well in advance. Management must be seen to be enthusiastic, but willing to make computerisation a team effort, with contributions by every member of staff valued and requested.
Next, through training. It is most important, of course, to provide adequate training in the various packages and methodologies, whether formal, that is, provided by an outside agency, or informal, that is, provided by a fully trained staff member. Certain application software, such as word-processing and spreadsheet programs, is frequently expected to be learnt 'off the screen', since it is so well known that everyone is presumed to be familiar with it. Another important strategy is to suit training to the stage of computer literacy of the trainees. Trainers should also make sure that personnel are not trained in too many different software procedures all at once. Getting staff interested in certain types of problem-solving computer games will make them computer literate in no time. Staff should learn to view computerisation as a means of getting rid of many of the more boring library tasks, leaving time for the more interesting, problem-solving tasks. Training should be made as interesting as possible: there are now many simulation programmes and videotapes available which should all be used in training, although they do not necessarily take the place of hands-on experience.
Other than that, through support. Those institutions and companies in possession of a computer department should make sure that this department acts in an advisory and maintenance capacity for library hardware.
Lastly, through user groups. The creation of user groups for particular software or methodologies, for instance the STAIRS, Online and INMAGIC user groups, allows experiences to be shared and, even more important, expensive training to be shared. Good ideas for training, for the use of programmes, and for the evaluation of results may be disseminated at meetings. Comparisons of search methods are most helpful in self-evaluation. Even databases may be shared.


Reference : 
Rodríguez, K. (1994). The Electronic Library.

Computer Abuse

There are three fundamental gaps between what computer security enforces and the way computer systems are used: first, technological; second, sociotechnological; and lastly, social.
1. First gap : computer mechanism vs computer policy
There is a significant gap in most existing systems between the computer security that is desired or thought to be in effect and what is in fact implemented. With discretionary access control, some users can gain access to resources that are not supposed to be accessible to them. Flaws in the technical system, such as hardware malfunctions or virus attacks, can let unauthorized users access information easily.

2. Second gap : computer policy vs social policy
A gap exists between the policy the computer controls are intended to enforce and the desired social policies. Not all social policies, such as those for privacy, copyright protection, data correctness and integrity, and human safety, can be enforced by computer policies.

3. Third gap : social policy vs (anti)social behavior  
A significant gap exists between desired human behavior and actual human behavior. It can arise on any particular computing system and affects a wide range of people, including users, administrators, developers and maintainers. It is difficult to prevent the intentional abuse of authorized access, and the problem becomes worse because of omnipotent superuser and administrative privileges. Authorized users are also given new opportunities and temptations to invade the privacy of others. There is also unintentional misuse: dependence on bad data can seriously affect both system behavior and human behavior. When security policies are established, it is important to recognize that human behavior will not always be perfect, legal, ethical or even acceptable.

Potentials for computer attacks
The potential for sabotage, espionage and computerized terrorism is alarming. Attacks appear to be escalating in step with improving defenses. It is important to recognize the need to impede multiperson collusion as well as single-user attacks.

Defending against computer system misuse
We need to be aware of the user of a system who innocently exercises a Trojan horse, the developer or maintainer of a system who accidentally installs a fundamental flaw, and the administrator who mistakenly trusts someone or some program that is not trustworthy. More applicable technology can contribute defenses: systems satisfying the advanced criteria (B2 or better) of the National Computer Security Center Trusted Computer System Evaluation Criteria (the "Orange Book") tend to have stronger assurance. Security controls must also constrain authorized users more closely toward what is considered acceptable behavior, for example through a combination of noncompromisable mandatory controls (levels and categories) and discretionary controls, with systematic use of rules and least privilege, plus anomaly detection systems that can detect potentially undesirable behavior. Real-time anomaly detection can contribute to narrowing Gap 1 by detecting deviations from accepted computer-system norms, and Gap 3 by detecting deviations from accepted social norms.
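As a rough illustration of the anomaly-detection idea, here is a minimal sketch that flags events deviating from a user's established norm. The threshold rule and the example data are assumptions for illustration, not from the article:

```python
from statistics import mean, stdev

def is_anomalous(history, value, k=3.0):
    """Flag value if it lies more than k standard deviations from the
    mean of the user's historical values (a classic threshold rule)."""
    if len(history) < 2:
        return False            # not enough history to establish a norm
    m, s = mean(history), stdev(history)
    if s == 0:
        return value != m       # constant history: any change is anomalous
    return abs(value - m) > k * s

# Hypothetical example: bytes downloaded per session by one user.
history = [100, 120, 110, 105, 115]
print(is_anomalous(history, 112))    # within the norm -> False
print(is_anomalous(history, 5000))   # far outside the norm -> True
```

Real intrusion-detection systems build much richer behavioral profiles, but the principle is the same: measure deviation from an accepted norm and report it in real time.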

Reference : Neumann, P. G. The computer-related risk of the year: computer abuse.

Features of computer

Speed:
A computer executes instructions at very high speed. The CPU of a computer can perform more than 10 million operations per second. All instructions are executed in accordance with a clock, whose frequency is measured in MHz. Normally, 3-4 cycles of this clock are required to execute one instruction. Recent computers have a clock speed of about 300 MHz, i.e. one cycle takes approximately 3 × 10⁻⁹ s. This means an instruction can be executed in about 10 ns (10 × 10⁻⁹ s); in other words, the CPU can execute roughly 100 million instructions in one second. However, the overall performance of a computer decreases due to slower input and output devices interfaced to the CPU.
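The arithmetic above can be checked in a few lines of Python: a 300 MHz clock taking 3 cycles per instruction gives about 100 million instructions per second.

```python
clock_hz = 300e6            # 300 MHz clock
cycles_per_instr = 3        # typical 3-4 cycles per instruction

cycle_time = 1 / clock_hz                     # ~3.3 ns per clock cycle
instr_time = cycles_per_instr * cycle_time    # ~10 ns per instruction
instrs_per_sec = clock_hz / cycles_per_instr  # instructions per second

print(f"{cycle_time * 1e9:.1f} ns per cycle")
print(f"{instr_time * 1e9:.1f} ns per instruction")
print(f"{instrs_per_sec / 1e6:.0f} million instructions per second")
```

This also makes clear why I/O dominates: a disk access taking milliseconds costs the equivalent of hundreds of thousands of instructions.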


Storage:

Given the speed with which computers can process large quantities of data and information, the size of the input, and likewise the output, is quite large. The size of information to be stored increases further with graphics applications. All this information is stored in auxiliary memory, i.e. the hard disk fitted inside the computer. Hard disks nowadays have storage capacities as large as 4 GB. The size of internal primary memory (RAM) has also increased a lot, to about 64 MB.


Accuracy:
The accuracy of results computed by a computer is consistently high. Due to digital techniques the error is very small. The errors in computing may be due to logical mistakes by a programmer or due to inaccurate data.

Reliability:
The reliability of results processed by a computer is very high. If a program is executed any number of times with the same set of data, every time the results would be the same.

Versatility:
Computers are capable of performing almost any task, provided the task can be reduced to a series of logical steps so that an appropriate program in a suitable language can be fed into the computer's memory. Of course, the input and output devices should be capable of performing the desired task. Because of these capabilities, a number of processes can be automated with the help of a computer.
Apart from those outlined above, a computer has some other features too. Computers are automatic to a great extent, i.e. they run with very little human interference. They can work endlessly at the same level of efficiency and productivity. Modern computers are becoming more and more user friendly, i.e. the computer itself helps the user at every stage. Visual displays, limited but effective use of natural language such as English, and appropriate software have made computers very easy to operate.

Reference : Features of a computer. Retrieved from https://imp.center/agri/features-of-a-computer/






Types of computer

  • Supercomputer and Mainframe
Supercomputer is a broad term for one of the fastest computers currently available. Supercomputers are very expensive and are employed for specialized applications that require immense amounts of mathematical calculation (number crunching). For example, weather forecasting requires a supercomputer. Other uses of supercomputers include scientific simulations, (animated) graphics, fluid dynamic calculations, nuclear energy research, electronic design, and analysis of geological data (e.g. in petrochemical prospecting). Perhaps the best known supercomputer manufacturer is Cray Research.


Mainframe was a term originally referring to the cabinet containing the central processor unit or "main frame" of a room-filling Stone Age batch machine. After the emergence of smaller "minicomputer" designs in the early 1970s, the traditional big iron machines were described as "mainframe computers" and eventually just as mainframes. Nowadays a Mainframe is a very large and expensive computer capable of supporting hundreds, or even thousands, of users simultaneously. The chief difference between a supercomputer and a mainframe is that a supercomputer channels all its power into executing a few programs as fast as possible, whereas a mainframe uses its power to execute many programs concurrently. In some ways, mainframes are more powerful than supercomputers because they support more simultaneous programs. But supercomputers can execute a single program faster than a mainframe. The distinction between small mainframes and minicomputers is vague, depending really on how the manufacturer wants to market its machines.





  • Minicomputer

It is a midsize computer. In the past decade, the distinction between large minicomputers and small mainframes has blurred, as has the distinction between small minicomputers and workstations. But in general, a minicomputer is a multiprocessing system capable of supporting up to 200 users simultaneously.





  • Workstation

It is a type of computer used for engineering applications (CAD/CAM), desktop publishing, software development, and other types of applications that require a moderate amount of computing power and relatively high quality graphics capabilities. Workstations generally come with a large, high-resolution graphics screen, a large amount of RAM, built-in network support, and a graphical user interface. Most workstations also have a mass storage device such as a disk drive, but a special type of workstation, called a diskless workstation, comes without a disk drive. The most common operating systems for workstations are UNIX and Windows NT. Like personal computers, most workstations are single-user computers. However, workstations are typically linked together to form a local-area network, although they can also be used as stand-alone systems.


N.B.: In networking, workstation refers to any computer connected to a local-area network. It could be a workstation or a personal computer.





  • Personal computer


It can be defined as a small, relatively inexpensive computer designed for an individual user. In price, personal computers range anywhere from a few hundred pounds to over five thousand pounds. All are based on the microprocessor technology that enables manufacturers to put an entire CPU on one chip. Businesses use personal computers for word processing, accounting, desktop publishing, and for running spreadsheet and database management applications. At home, the most popular use for personal computers is for playing games and recently for surfing the Internet.

Personal computers first appeared in the late 1970s. One of the first and most popular personal computers was the Apple II, introduced in 1977 by Apple Computer. During the late 1970s and early 1980s, new models and competing operating systems seemed to appear daily. Then, in 1981, IBM entered the fray with its first personal computer, known as the IBM PC. The IBM PC quickly became the personal computer of choice, and most other personal computer manufacturers fell by the wayside. P.C. is short for personal computer or IBM PC. One of the few companies to survive IBM's onslaught was Apple Computer, which remains a major player in the personal computer marketplace. Other companies adjusted to IBM's dominance by building IBM clones, computers that were internally almost the same as the IBM PC, but that cost less. Because IBM clones used the same microprocessors as IBM PCs, they were capable of running the same software. Over the years, IBM has lost much of its influence in directing the evolution of PCs. Therefore, after the release of the first PC by IBM, the term PC increasingly came to mean IBM or IBM-compatible personal computers, to the exclusion of other types of personal computers, such as Macintoshes. In recent years, the term PC has become more and more difficult to pin down. In general, though, it applies to any personal computer based on an Intel microprocessor, or on an Intel-compatible microprocessor. For nearly every other component, including the operating system, there are several options, all of which fall under the rubric of PC.


Today, the world of personal computers is basically divided between Apple Macintoshes and PCs. The principal characteristics of personal computers are that they are single-user systems and are based on microprocessors. However, although personal computers are designed as single-user systems, it is common to link them together to form a network. In terms of power, there is great variety. At the high end, the distinction between personal computers and workstations has faded. High-end models of the Macintosh and PC offer the same computing power and graphics capability as low-end workstations by Sun Microsystems, Hewlett-Packard, and DEC. 





  • Tower model

The term refers to a computer in which the power supply, motherboard, and mass storage devices are stacked on top of each other in a cabinet. This is in contrast to desktop models, in which these components are housed in a more compact box. The main advantage of tower models is that there are fewer space constraints, which makes installation of additional storage devices easier. 




  • Desktop model

A computer designed to fit comfortably on top of a desk, typically with the monitor sitting on top of the computer. Desktop model computers are broad and low, whereas tower model computers are narrow and tall. Because of their shape, desktop model computers are generally limited to three internal mass storage devices. Desktop models designed to be very small are sometimes referred to as slimline models.




  • Notebook computer

An extremely lightweight personal computer. Notebook computers typically weigh less than 6 pounds and are small enough to fit easily in a briefcase. Aside from size, the principal difference between a notebook computer and a personal computer is the display screen. Notebook computers use a variety of techniques, known as flat-panel technologies, to produce a lightweight and non-bulky display screen. The quality of notebook display screens varies considerably. In terms of computing power, modern notebook computers are nearly equivalent to personal computers. They have the same CPUs, memory capacity, and disk drives. However, all this power in a small package is expensive. Notebook computers cost about twice as much as equivalent regular-sized computers. Notebook computers come with battery packs that enable you to run them without plugging them in. However, the batteries need to be recharged every few hours.



  • Laptop computer

A small, portable computer -- small enough that it can sit on your lap. Nowadays, laptop computers are more frequently called notebook computers.




  • Subnotebook computer

A portable computer that is slightly lighter and smaller than a full-sized notebook computer. Typically, subnotebook computers have a smaller keyboard and screen, but are otherwise equivalent to notebook computers.




  • Hand-held computer 

A portable computer that is small enough to be held in one’s hand. Although extremely convenient to carry, handheld computers have not replaced notebook computers because of their small keyboards and screens. The most popular hand-held computers are those that are specifically designed to provide PIM (personal information manager) functions, such as a calendar and address book. Some manufacturers are trying to solve the small keyboard problem by replacing the keyboard with an electronic pen. However, these pen-based devices rely on handwriting recognition technologies, which are still in their infancy. Hand-held computers are also called PDAs, palmtops and pocket computers.




  • Palmtop

A small computer that literally fits in your palm. Compared to full-size computers, palmtops are severely limited, but they are practical for certain functions such as phone books and calendars. Palmtops that use a pen rather than a keyboard for input are often called hand-held computers or PDAs. Because of their small size, most palmtop computers do not include disk drives. However, many contain PCMCIA slots in which you can insert disk drives, modems, memory, and other devices. Palmtops are also called PDAs, hand-held computers and pocket computers.




  • PDA

Short for personal digital assistant, a handheld device that combines computing, telephone/fax, and networking features. A typical PDA can function as a cellular phone, fax sender, and personal organizer. Unlike portable computers, most PDAs are pen-based, using a stylus rather than a keyboard for input. This means that they also incorporate handwriting recognition features. Some PDAs can also react to voice input by using voice recognition technologies. The field of PDA was pioneered by Apple Computer, which introduced the Newton MessagePad in 1993. Shortly thereafter, several other manufacturers offered similar products. To date, PDAs have had only modest success in the marketplace, due to their high price tags and limited applications. However, many experts believe that PDAs will eventually become common gadgets.

PDAs are also called palmtops, hand-held computers and pocket computers.

Reference : Types of computers. Retrieved from https://www.cs.cmu.edu/~fgandon/lecture/uk1999/computers_types/