Noise Control and Radon


Traffic in residential areas poses health hazards to inhabitants. Urbanization and development have continued to push for more roads and for the expansion of existing ones to accommodate more vehicles. Heavy commercial vehicles and buses generate noise levels that cannot be recommended near homes. City planning should follow policies that minimize these health hazards before seeking approval for further freeway expansion.

There are many health hazards associated with noise pollution. According to Goines and Louis (2007), it can cause hearing impairment: loud traffic noise damages our hearing gradually. Noise also disturbs sleep, and the effect is most severe for the young, the elderly, and people with existing sleep problems. Traffic noise is not uniform; it arrives in unpredictable bursts that can wake people up. Cardiovascular problems have also been associated with noise. Research indicates that noise triggers the body to produce adrenaline, and frequent noise-induced changes in heart rate can eventually damage the heart. Cognitive ability is adversely affected as well: noise has been linked to poor school performance through impaired concentration, attention, and social-emotional development. In addition, people exposed to high noise levels are easily annoyed, an effect that can become permanent over time.

The city council can adopt several policies to control the noise. One policy I recommend is distance restriction: the council can prohibit the building of residential houses within a certain distance of the freeway, or prohibit freeway expansion from encroaching within that distance of existing homes. The distance should be large enough to reduce the noise to a safe level. The council can also restrict road use by time, allowing heavy traffic only by day and thereby reducing disturbance at night. Most freeways have controlled slip roads at their entrances, making such traffic control feasible. The council can likewise restrict loud vehicles to certain hours.
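A rough acoustic rule of thumb helps to quantify such distance restrictions: sound from a compact (point) source drops by about 6 dB for every doubling of distance. A long, busy freeway behaves more like a line source and attenuates more slowly, so the sketch below, with made-up example figures, is only an idealized estimate:

```python
import math

def attenuated_level(level_db: float, d_ref: float, d: float) -> float:
    """Sound level at distance d, given level_db measured at distance d_ref.

    Uses the inverse-square (point-source) model: the level drops by
    about 6 dB for every doubling of distance. A freeway is closer to
    a line source (~3 dB per doubling), so this is only a rough sketch.
    """
    return level_db - 20 * math.log10(d / d_ref)

# Illustrative figures only: 80 dB measured 10 m from the roadway.
print(round(attenuated_level(80, 10, 160), 1))  # four doublings -> ~56 dB
```

A planner could invert the same formula to find the setback distance that brings a measured roadside level down to a target value.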


Radon is a naturally occurring gas from beneath the earth. Kansas State University explains that radon comes from uranium deep inside the Earth, and that its rate of escape into the atmosphere depends on cracks in the rocks. Radon is an inert, electrically uncharged, but radioactive gas. As it decays, it changes into radioactive atoms called radon progeny. These particles are electrically charged and attach themselves to dust particles. When those dust particles are inhaled, the radon progeny atoms are inhaled alongside them. Once in the lungs, they adhere to the lung lining and continue to decay, emitting radiation as they change into other atoms. The radiation has a very short range, but it is responsible for lung cancer. Although the threat does not extend to other organs, it is very dangerous to the lungs (Kansas State University, n.d.).
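The decay underlying radon progeny follows the standard half-life law, N(t) = N(0) · 2^(−t/T). As an illustrative sketch using the well-known half-life of radon-222 (about 3.8 days), and ignoring the progeny's own decay chain:

```python
RADON_222_HALF_LIFE_DAYS = 3.8  # approximate half-life of radon-222

def remaining_fraction(days: float,
                       half_life: float = RADON_222_HALF_LIFE_DAYS) -> float:
    """Fraction of an initial radon quantity still undecayed after `days`."""
    return 0.5 ** (days / half_life)

# After one half-life, half remains; after about two weeks, very little does.
print(round(remaining_fraction(3.8), 3))   # 0.5
print(round(remaining_fraction(14.0), 3))  # under a tenth remains
```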

Research is ongoing to establish the severity of radon-induced cancer. Documented results show that most of the recorded deaths involved people who also smoked. However, further research suggests that radon would have contributed to these deaths even if the victims had not smoked. Radon levels in residential houses are much lower than in mines, so the gas is sometimes assumed to have minimal influence. It has nonetheless been linked to many deaths among smokers, making it especially dangerous to people who already smoke (WHO, 2002).

There are several measures one can take on realizing that one lives in a radon-prone region. It should be noted, however, that radon leakage is not constant; its intensity can vary even between adjacent houses. The first and best option is to leave a house with a radon leak, especially if you are a smoker, since the combination is a serious health hazard. The second option is to maintain the house and improve its ventilation: cracks and exposed building stone should be sealed completely, the ceiling should be covered, and airflow should be increased. People living in radon-prone areas can also install a radon sump system, or alternatively a pressurized ventilation system (WHO, 2002).




Goines, L., & Louis, H. (2007). Noise pollution: A modern plague. Southern Medical Journal, 100, 287–294.

Kansas State University. (n.d.). The health effect of exposure to indoor radon. Retrieved from

WHO. (2002). Radon and health. Available from

Film Analysis of “Tuesdays with Morrie”

The specific grief, loss, or bereavement issue facing Morrie, the main character of Tuesdays with Morrie, is his battle with a fatal illness, which leads him to use the prospect of death to teach others how to live. Morrie has spent the vast majority of his life as a professor of sociology at Brandeis University, a position he fell into “as a matter of course.” He is a brilliant instructor, and he resigns when he starts to lose control of his body to ALS, amyotrophic lateral sclerosis, otherwise known as Lou Gehrig’s disease. The ailment attacks his body but, ironically, leaves his mind as clear as ever. He understands that his time is running out and that he must share his insight on “the meaning of life” with the world before it is too late to do so. Mitch serves as a vehicle through which he can pass on this wisdom, to Mitch personally and, more indirectly, to the larger audience he reaches after his death by means of the book itself (Tuesdays with Morrie, 1999).

He and Mitch plan the book during his dying days, deeming it their last project together. The kind of grief, loss, and bereavement at issue is the loss of hope in life and its subsequent resolution. Throughout the film, Morrie constantly discusses detaching himself from his experience, particularly when he suffers violent coughing spells. Morrie bases this theory of detachment on a Buddhist philosophy: he feels that no one should cling to anything, and that everything that exists is impermanent (Delshad, 2010).

As is evident in the film, the theoretical framework that can be attached to the grief Morrie is going through, and to the approaches he uses in dealing with it, is Buddhist theory. Through detachment he is able to withdraw from his surroundings into his own consciousness, and in this way he gains perspective in uncomfortable and distressing circumstances. However, Morrie does not use this technique to stop feeling or experiencing; he genuinely wants to experience each situation fully. Only after he has experienced a given emotion is he ready to let go and detach himself. He practices this often during life-threatening episodes such as his severe coughing spells, because he does not want to die angry or frightened. He detaches himself so he can accept these circumstances in his life, and so that he will be able to embrace his approaching death more easily. This is the pattern that best characterizes the character’s grief reactions and responses.

The resolution and determination are the result of all the occasions and events in the story. During one of the Tuesday lessons, Morrie says to Mitch, “you talk, I’ll listen” (Albom, 188), implying that even after his death he wants Mitch to talk to him much as he did while he was alive. Toward the film’s end, when Mitch is at Morrie’s memorial service, he tries talking to Morrie and is pleasantly surprised at how natural it feels (Tuesdays with Morrie, 1999). We can assume that Mitch will continue to find comfort and direction in his life through these conversations with Morrie. We can also tell that Mitch and his brother will stay in contact more than before. The case suggests several implications for social work practice and for interventions that promote adaptive coping and adjustment to grief. “Love one another or die” (Albom, 163). Morrie stresses this quote and theme throughout the film; he feels that an abundance of love and compassion is the highest form of fulfillment one can experience. The services he renders to the community are thus part of the social work obligation, expressed through the love and compassion that are critical to Morrie, particularly since he lacked such expression in his childhood.

Interventions are necessary to reconcile his spirit with the early death of his mother and with his ever-busy father; Morrie did not feel a sense of love until Eva came into their home. Eva nurtured and tended to him as though he were her own child, and Morrie carried this sense of love and compassion with him for the rest of his life (Tuesdays with Morrie, 1999). This should be supported in the best way possible, since Morrie is trying to make the best of his life. Because he was deprived of love at an early age, for the rest of his life he needs to be offered love continually so that he can offer his own love and compassion to others. The social work practice and interventions we learn from the film underline that interventions should center on rejecting pop-culture values and standards in order to develop one’s own sense of values. Throughout his life, Morrie has been successful at rejecting this dictatorship of popular culture and at creating his own culture based on love, compassion, acceptance, and communication. Morrie feels that the media drives greed and violence, which pop culture then promotes (Katz, 2008). He was successful at re-evaluating his own life and what he feels is true fulfillment. We also see how unfulfilled Mitch seems to be with his busy working life and material aspirations. Through his lessons, Morrie was able to open Mitch’s eyes to what really fulfills one in life (Katz, 2008).










Albom, M. (2010). Tuesdays with Morrie: An old man, a young man and life’s greatest lesson (10th anniversary ed.). New York: Random House Large Print.

Delshad, Farshid (2010). Interaction of Religion, Morality and Social Work. Munich: AVM (Akademischer Verlag München).

Katz, Michael B. (2008). In the Shadow of the Poorhouse: A Social History of Welfare in America. New York: Basic Books.

Tuesdays with Morrie [Film]. (1999). Retrieved September 18, 2015, from


Science Fiction – When Science Meets Fiction


There are umpteen works of science fiction for children. “The Last Super Bowl Game” by George R. R. Martin and “Of Mist, and Grass, and Sand” by Vonda N. McIntyre are two exemplary works from Zipes’s The Norton Anthology of Children’s Literature. While “The Last Super Bowl Game” reveals the power of future computing and takes the human imagination to another level, “Of Mist, and Grass, and Sand” is a biological science fiction story that requires a deeper understanding of scientific processes.

“The Last Super Bowl Game” depicts a fictional world where virtual simulation has replaced the real game of football. The characters are framed so that two teams appear to be playing the last actual live game ever. The author constructs alternative realities that are both consequential and entertaining. The story responds to the fact that the whole country is glued to the television set, consuming sport as entertainment, and it shows how sport turned into TV programming. Martin anticipated much of what could happen: the gaming arena is so cleverly rendered that the match seems to unfold before our eyes on a real playground rather than in a computer-simulated arena, while the players are in fact all replaced by pixels and simulation. In this futuristic tale, presented by Martin in 1975, computerized simulations have taken over the sports scene: first Major League Baseball, then hockey, then the National Basketball Association, and finally the National Football League. By 2016, when the last Super Bowl was played, computers had become the exclusive home of the arenas.

Sport is full of arguments about who the winner is, by how many runs, and who deserves the best-player award, but these arguments faded once all the answers could be fed into a computer holding a full database of each game by year. A new kind of fed-in sport grew up, and the computers grew ever more powerful and capable of lifelike images. Simulated sport became a national obsession. The story was written when home computers were unheard of and were themselves a fantasy. It is prescient in many ways, but especially in the field of sports, given the later development of artificial intelligence and advances in electronics.

Martin’s vision was not perfect, but he had seen the power of future computing. How did he know that invasive computer technology would enter our lives? Early computing lacked depth: games were presented on flat screens, and colorless renderings seemed acceptable but could not compare to watching real games. Far richer simulation later arrived with modern video game technology, along with the digital special effects, stop-motion animation, models, and puppets of movies like Jurassic Park, The Phantom Menace, and Avatar. By 2012, sports games had high-definition graphics and thick playbooks; the players not only look real, but even their skin tones are matched so the figures resemble miniatures of the athletes.

Mankind has competed for food, riches, land, and glory since its inception. Computers offer a fun way to simulate the past, but they cannot factor in the variables caused by human error. Sports video games are not only watched but played, human against human, in worldwide competitions for significant prize money, and the real thing is going stronger than ever: the 2011 World Series had more than 24.5 million viewers, and the Super Bowl is watched by more than a billion people worldwide, unlike in Martin’s story, where 843 spectators leave early because of heavy rain.

Although Martin’s assessment of future technology is dead on, his picture of the real thing’s status is less so. This work of science fiction is less about technology than it is a reminder of the indelible qualities of humanity that pervade technology: creativity and invention, socialization, and the exploration of our mental and physical limits. Martin gives as much attention to the real product as to his version of the simulated one, writing with the eye of a true sports fan who can describe the misery, joy, and sacrifice of following a favorite team.

In “Of Mist, and Grass, and Sand,” biotechnology is far more advanced than on today’s Earth. It is a biology-based science fiction story: as the author frames it, the Earth is in a post-apocalyptic future, socially and scientifically very different from the modern world.

The story is plotted in clever, simple language, with quick-moving prose that lyrically describes the intense landscape, taking the reader straight into its half-strange, half-familiar desert world, and with finely drawn characters whose emotional moods and changes are carefully depicted. The fiction portrays a wonderfully strong female character alongside a comparatively weak male one.

From the start the setting is different, as the care provided in the story is entirely unorthodox. Yet the way the healer works in this science fiction story is remarkably similar to the everyday wear and tear of modern medical practice. Snake, a young itinerant female healer, has been asked to save the life of a boy. The story details her attempts to cure him, her interactions with the boy, his family, and his community, and the tools of her trade: the snakes Grass, Mist, and Sand. This strong and complex character must also deal with issues that interfere with her professional development: truth-telling, ignorant family members, the disapproval of her teachers and peers, and self-sacrifice.

The author introduces us to her protagonist Snake, who uses her knowledge and her genetically altered snakes to cure people’s illnesses and suffering. Snake is called by a family to treat a little boy suffering from a tumor. To comfort him she leaves her dreamsnake, Grass, on the child’s pillow while she prepares Mist, her cobra, whose venom she converts into medicine to treat the tumor. Returning from that strenuous night, she finds that the boy’s parents, in the terror of snakes bred into their desert people, have killed her dreamsnake. Without Grass, whose bite eases death, Snake is handicapped as a healer, and she fears how she will treat people without a dreamsnake. Because dreamsnakes are scarce, the loss threatens both her quest for another and her career as a healer.

A dreamsnake is a small snake whose venom can induce torpor and hallucination in humans, akin to the effects of drugs like heroin or LSD. The author predicted a post-apocalyptic future for this biological fiction, and in today’s world we can see comparable developments in biotechnology: cloning, mutation, sex alteration, and more.












Zipes’s The Norton Anthology of Children’s Literature (Alphabets and Primers and Readers)

Lukens’s A Critical Handbook of Children’s Literature (Picture Books)









The Design of Windows Operating System Performance Evaluation and Analysis Guidelines




Designing Windows operating system performance evaluation and analysis guidelines is the process of establishing policies and procedures for determining how best to optimize Windows operating systems for peak performance (IFIPTM (Conference) et al., 2015).

In the modern world, new advancements in software technology are creating a competitive edge for newer operating systems, and the existing ones need corresponding improvements. The main aim of designing Windows operating system performance evaluation and analysis guidelines is to ensure efficient use of the operating system and of the computer’s hardware components (IFIPTM (Conference) et al., 2015).

The main function of the Windows operating system is to provide an interface among the hardware devices so that they function as a single unit. Windows operating systems are the most widely used worldwide, and designing performance evaluation guidelines helps satisfy the needs of diverse end users across the world (IFIPTM (Conference) et al., 2015).

Windows Operating Systems File System Management

Windows operating system file system management is the process of organizing and keeping track of all the files on the computer’s hard disk. It separates data into single units, each with a unique name, making information easier to identify and retrieve (DASFAA (Conference: Database systems) et al., 2015).

The main requirement for a Windows operating system is to provide a convenient, efficient, and robust file system for users to store, retrieve, and manipulate data. Effective Windows file management includes folder structures and backup strategies; the directory structure should make it easy to store and back up files safely (DASFAA (Conference: Database systems) et al., 2015).

Within the folder structure, files are organized by attributes such as file type, access mechanism, and space allocation. Several file types are used in operating system file management, for example ordinary files, directory files, and special files (Rajgarhia & Gehani, 2010).

In the modern computer science world, Windows file management can be assisted by artificial intelligence. Intelligent-agent technology is applied in industrial and commercial domains, where case-based reasoning techniques support analogical reasoning and problem solving in areas such as design, planning, and classification. The intelligent-agent approach seeks to deal with common file management problems related to deletion and to temporary versus permanent files (Rajgarhia & Gehani, 2010).

The latest trend in operating system file management is the data center file management system. A data center is an infrastructure of networks, servers, and big-data storage tools managed from a command center, where the data is strongly protected against damage, including natural disasters and other threats (Rajgarhia & Gehani, 2010).

Methods Used In File System Management

According to Calder et al. (2011), one method used in file system management is volume management. Volume management is the highest level of file system organization; it deals with partitions, the logical divisions of the computer’s hard disk. Operating system data is stored on volumes within the partitions on the hard disk.
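A volume’s capacity and usage can be inspected from a script. A small illustrative sketch using Python’s standard library (the drive root shown is an assumption; substitute whatever volume exists on your system):

```python
import shutil

def volume_report(mount: str) -> str:
    """Report total, used, and free space on a volume, in GiB."""
    usage = shutil.disk_usage(mount)
    gib = 1024 ** 3
    return (f"{mount}: total={usage.total / gib:.1f} GiB, "
            f"used={usage.used / gib:.1f} GiB, "
            f"free={usage.free / gib:.1f} GiB")

print(volume_report("/"))  # use a drive root such as "C:\\" on Windows
```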

The second method used in file system management is fast file system recovery, designed to recover files as quickly as possible in the event of an unscheduled outage. Rather than relying on a lengthy full file system check, it makes it possible to recover from and repair any data inconsistencies caused by outages and unplanned system failures (Calder et al., 2011).

The third method used in file system management is shared file system support, which distributes files over a network infrastructure so that users at different locations and on different platforms can access resources, provided their systems are configured to access the shared files and the client computers have been granted sufficient permissions. This is one of the most convenient file management methods because users can access data at their convenience (Calder et al., 2011).

Another method used in file system management is file system mounting: the process of attaching a file system to the directory tree, which is rooted at the root directory. Mounting takes place at a special location called the mount point; once mounting is complete, any data previously visible at the mount point becomes inaccessible (Calder et al., 2011).
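Whether a given directory is a mount point can be checked directly from the standard library. A minimal sketch:

```python
import os

def describe_mounts(paths):
    """Report which of the given paths are mount points.

    os.path.ismount() returns True where a file system has been
    attached to the directory tree.
    """
    return {p: os.path.ismount(p) for p in paths}

# "/" is always a mount point on Unix-like systems; on Windows a drive
# root such as "C:\\" plays the same role.
print(describe_mounts(["/", "/tmp"]))
```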

The Windows file manager is another tool used in file system management. It is implemented on the graphical user interface, where users interact with files and folders across the various Windows versions. In the file manager, the hierarchical folder structure is displayed in the left pane while the selected folder’s contents are shown in the right pane. Users are granted permissions in accordance with the IT policies and procedures of the business; those permissions then let them copy, delete, rename, or even print files and folders (Reeder et al., 2008).

The File Allocation Table (FAT) is another method used to implement file system management. FAT is a traditional approach first introduced on the MS-DOS platform. In FAT, data is stored in sectors, which are grouped into clusters, the data allocation units on the hard drive. The Windows operating system tracks which clusters are used by each file (Reeder et al., 2008).

In addition, the FAT table stores information about the relationships between clusters and files: each cluster in use has an entry in the table, and the disk typically stores two copies of the table for redundancy. Over time a variety of FAT versions have been developed, including FAT12, FAT16, and FAT32. The newer versions were developed for newer operating systems, though they retain backward compatibility with earlier ones (Reeder et al., 2008).

There are several advantages to using FAT in file system management. One is efficient use of hard disk space: in FAT, large files need not occupy contiguous clusters and can be placed wherever clusters are free. Secondly, later FAT versions with long file name support allow names of up to 255 characters, unlike the original 8.3 naming scheme (Burguera et al., 2011).
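The space-efficiency trade-off of cluster allocation is easy to quantify: files are allocated in whole clusters, so the tail of each file wastes some “slack” space in its last cluster. A worked sketch with an assumed 4 KiB cluster size:

```python
import math

def clusters_needed(file_size: int, cluster_size: int = 4096) -> int:
    """Number of clusters a file occupies (allocation is whole clusters)."""
    return math.ceil(file_size / cluster_size)

def slack_bytes(file_size: int, cluster_size: int = 4096) -> int:
    """Wasted space in the last cluster ("slack"): allocated minus used."""
    return clusters_needed(file_size, cluster_size) * cluster_size - file_size

# A 10,000-byte file with 4 KiB clusters occupies 3 clusters,
# wasting 2,288 bytes of slack.
print(clusters_needed(10_000), slack_bytes(10_000))  # 3 2288
```

Smaller clusters reduce slack but enlarge the allocation table, which is the trade-off behind the different FAT versions’ default cluster sizes.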

Windows File System Management in Windows XP

Windows XP uses an updated version of FAT called FAT32, which stores files in hard disk clusters with a default size as small as 4 KB. Windows XP also supports the FAT16 and NTFS file systems (Burguera et al., 2011).

FAT32 and FAT16 are used in Windows XP when setting up a dual boot with earlier operating systems, providing backward compatibility. NTFS, on the other hand, provides file and folder security in addition to file compression. Compared with FAT16, FAT32 uses space more efficiently and is more robust and flexible (Burguera et al., 2011).

Windows XP file system management is done with a tool called Windows Explorer. Options available in Windows Explorer for file management include full-screen view, auto-hide, sorting, details view, renaming, and copying and moving folders, among others. Another file management tool in Windows XP is My Computer, whose desktop icon gives users access to all the files on the computer (Burguera et al., 2011).

My Documents and the other Windows folders, such as My Pictures and My Music, are also used to manage files. Lastly, the command prompt is a Windows XP file management tool that uses MS-DOS commands to manage files and folders (Reeder et al., 2008).

Windows File System Management in Windows 7

Windows 7 uses the NTFS file system to store files on the computer’s hard drive. The main reason NTFS is the preferred file system for Windows 7 is that inherent disk-related errors are recovered automatically, so the operating system’s performance is not disrupted by system crashes (IFIPTM (Conference) et al., 2015).

Secondly, NTFS supports large-capacity hard disks, significantly increasing storage space and hardware performance. Thirdly, NTFS has improved security features, such as permissions and encryption, that administrators can use to restrict access to confidential or system data that might otherwise be accidentally deleted or accessed (IFIPTM (Conference) et al., 2015).

The file system management features in Windows 7 include Windows Explorer, libraries, document management, the folder structure, folder addresses, and naming conventions. Windows Explorer presents all files, folders, and subfolders in a hierarchical structure consisting of the root drive, parent folders, and subfolders, which can be expanded or collapsed to view or hide the contents of a folder, subfolder, or drive (IFIPTM (Conference) et al., 2015).

Significance of Windows Operating System File System Management in Computer Science and Industry

Windows operating system file system management is important in the field of computer science and industry for a number of reasons. Firstly, it is applied in several artificial intelligence technologies, one of which is case-based reasoning (CBR) (IFIPTM (Conference) et al., 2015).

Case-based reasoning is used in industrial and commercial systems to solve problems by drawing on knowledge of past cases. Specific areas of industrial application include design, planning, and classification. Furthermore, case-based reasoning is used to distinguish temporary from permanent files. In computer science this approach is also referred to as agent-oriented technology (Rajgarhia & Gehani, 2010).

File system management is also applied in hardware device management. All the devices connected to a computer system are managed through it: the Windows operating system communicates with the connected hardware through device drivers, which act as the interface between the electrical signals of the hardware components and the operating system (Rajgarhia & Gehani, 2010).

In industry, distributed file systems let users access resources stored on centralized systems at their convenience. In distributed systems, servers store information in a shared-disk file system that provides access control and translation for users in different locations. The servers use storage area networks (SAN), which provide information protection and directory services, among other functions; on the SAN, every file has a unique identifier within the file system (Rajgarhia & Gehani, 2010).

Redundant Array of Inexpensive Disks (RAID) is another application of file system management in computer science. Information technology organizations use RAID as a strategy for fault tolerance and backup. In RAID, data is organized across multiple disks that are integrated into one high-performance logical disk; when information stored on the array is accessed, the controller stripes data across the disks. The main advantage of RAID is that large amounts of data can be accessed at very high transfer rates (Calder et al., 2011).
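Striping, the mechanism behind RAID’s high transfer rates, can be sketched in a few lines. This is a toy model of RAID 0 only; real arrays add mirroring or parity for fault tolerance, which this example omits:

```python
def stripe(data: bytes, n_disks: int, stripe_unit: int = 4) -> list[bytes]:
    """Distribute `data` across `n_disks` in round-robin stripe units (RAID 0).

    Each stripe unit goes to the next disk in turn, so sequential reads
    and writes can hit all disks in parallel.
    """
    disks = [b""] * n_disks
    for i in range(0, len(data), stripe_unit):
        disks[(i // stripe_unit) % n_disks] += data[i:i + stripe_unit]
    return disks

print(stripe(b"ABCDEFGHIJKL", n_disks=3))  # [b'ABCD', b'EFGH', b'IJKL']
```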



Fundamental Elements of Windows Operating System Performance Evaluation and Analysis Guidelines

The fundamental aim of Windows operating system performance evaluation and analysis is to develop a technical understanding of the processes and tools used to optimize the performance of the operating system. The general guidelines for improving operating system performance propose a number of recommendations for increasing the performance of Windows operating systems (Calder et al., 2011).

The recommendations propose several measures for operating system optimization. The first is installing the latest BIOS updates released by hardware manufacturers, who frequently release BIOS updates and upgrades that significantly improve the performance of Windows operating systems (Calder et al., 2011).

Secondly, they propose installing the latest storage area network (SAN) drivers, network adapter drivers, and firmware from the manufacturers’ websites, in order to optimize the operating system’s communication with all the hardware devices on the computer system and network infrastructure (Calder et al., 2011).

The evaluation and analysis guidelines should also propose ways of tuning the Windows operating system for peak performance. One required operation is cleaning up disk errors: crashing programs leave errors on the hard disk that slow down the performance of the operating system (Calder et al., 2011).

Disk cleanup programs are used to check for and clean any errors on the hard disk. Secondly, the guidelines should propose frequent removal of temporary files, which also slow down the computer. Other measures include disk defragmentation, installation of antivirus programs, and Microsoft Windows updates (Calder et al., 2011).
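Finding stale temporary files is straightforward to script. A cautious sketch that only lists candidates rather than deleting them, so the output can be reviewed first (the seven-day threshold is an arbitrary example):

```python
import os
import tempfile
import time

def stale_temp_files(temp_dir: str, max_age_days: float = 7.0) -> list[str]:
    """List files in `temp_dir` not modified for more than `max_age_days`."""
    cutoff = time.time() - max_age_days * 86_400
    stale = []
    for dirpath, _dirnames, filenames in os.walk(temp_dir):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) < cutoff:
                    stale.append(path)
            except OSError:
                pass  # file vanished or is locked; skip it
    return stale

# tempfile.gettempdir() resolves %TEMP% on Windows, /tmp on Unix-likes.
print(len(stale_temp_files(tempfile.gettempdir())))
```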

Future Direction of Windows Operating System Performance Evaluation and Analysis Guidelines

The future direction of Windows operating system performance evaluation and analysis guidelines depends on emerging technological trends in improving the performance of the latest operating systems on the market. A performance evaluation of the latest Windows operating systems reveals major improvements in the use of system software and hardware components (Burguera et al., 2011).

Windows 7 and 8 include several new features that have made performance evaluation and analysis more accurate and reliable. Major improvements have also been realized in CPU performance, memory management, hard disk management, and network performance. These innovative evaluation techniques have a major impact on how Windows operating systems are used (Burguera et al., 2011).

Going beyond Windows 8, Microsoft has taken several steps to improve the performance of its operating systems. According to performance evaluation and analysis of applications developed alongside the operating system, Microsoft has gone a step further by unifying users with cloud profiles on services such as Twitter, Facebook, LinkedIn, and WhatsApp, among many others. The development of cloud operating systems is a major breakthrough in making it easier for users to interact with Windows operating systems across devices such as mobile phones, tablets, and smartphones (Burguera et al., 2011).




Burguera, I., Zurutuza, U., & Nadjm-Tehrani, S. (2011, October). Crowdroid: Behavior-Based Malware Detection System for Android. In Proceedings of the 1st ACM Workshop on Security and Privacy in Smartphones and Mobile Devices (pp. 15-26). ACM.

Calder, B., Wang, J., Ogus, A., Nilakantan, N., Skjolsvold, A., McKelvie, S., … & Rigas, L. (2011, October). Windows Azure Storage: a highly available cloud storage service with strong consistency. In Proceedings of the Twenty-Third ACM Symposium on Operating Systems Principles (pp. 143-157). ACM.

DASFAA (Conference : Database systems), In Renz, M., In Shahabi, C., In Zhou, X., & In Cheema, M. A. (2015). Database Systems for Advanced Applications: 20th International Conference, DASFAA 2015, Hanoi, Vietnam, April 20-23, 2015, Proceedings.

IFIPTM (Conference), In Damsgaard, J. C., In Marsh, S., In Dimitrakos, T., & In Murayama, Y. (2015). Trust management IX: 9th IFIP WG 11.11 International Conference, IFIPTM 2015, Hamburg, Germany, May 26-28, 2015, Proceedings.

Rajgarhia, A., & Gehani, A. (2010, March). Performance and Extension of User Space File Systems. In Proceedings of the 2010 ACM Symposium on Applied Computing (pp. 206-213). ACM.

Reeder, R. W., Bauer, L., Cranor, L. F., Reiter, M. K., Bacon, K., How, K., & Strong, H. (2008, April). Expandable Grids for Visualizing and Authoring Computer Security Policies. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1473-1482). ACM.