dc.contributor.advisor Chan-Tin, Eric
dc.contributor.author Cui, Weiqi
dc.date.accessioned 2019-10-25T19:02:51Z
dc.date.available 2019-10-25T19:02:51Z
dc.date.issued 2019-05-01
dc.identifier.uri https://hdl.handle.net/11244/321512
dc.description.abstract Most privacy-conscious users utilize HTTPS and an anonymity network such as Tor to mask source and destination IP addresses. It has been shown that encrypted and anonymized network traffic traces can still leak information through a type of attack called a website fingerprinting (WF) attack. The adversary records the network traffic and is only able to observe the number of incoming and outgoing messages, the size of each message, and the time difference between messages. Previous work has shown that website fingerprinting attacks achieve over 90% accuracy when Tor is used as the anonymity network. Thus, an Internet Service Provider can successfully identify the websites its users are visiting.
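As an illustration of the information available to such an adversary, the following Python sketch reduces a recorded trace to per-packet direction, size, and inter-arrival time. It is a minimal example written for this record, not code from the dissertation; the Packet structure and field names are assumptions.

from dataclasses import dataclass

@dataclass
class Packet:
    timestamp: float   # seconds since capture start
    size: int          # bytes on the wire
    outgoing: bool     # True if sent from the client toward the network

def extract_features(trace):
    # Reduce a packet trace to (direction, size, inter-arrival gap) triples,
    # the only information assumed visible to the adversary.
    features = []
    prev_time = None
    for pkt in trace:
        gap = 0.0 if prev_time is None else pkt.timestamp - prev_time
        direction = 1 if pkt.outgoing else -1
        features.append((direction, pkt.size, gap))
        prev_time = pkt.timestamp
    return features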
dc.description.abstract Mitigations to these attacks include using cover/decoy network traffic to add noise, padding to ensure all network packets are the same size, and introducing network delays to confuse an adversary. Although these mitigations have been shown to be effective, reducing the attack accuracy to 10%, the overhead is very high: the latency overhead is above 100% and the bandwidth overhead is at least 40%. We introduce a new realistic cover traffic algorithm, based on a user's previous network traffic, to mitigate website fingerprinting attacks. In simulations, our algorithm reduces the accuracy of attacks to 14% with zero latency overhead and about 20% bandwidth overhead. In real-world experiments, our algorithm reduces the accuracy of attacks to 16% with only 20% bandwidth overhead.
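The cover-traffic idea can be sketched as follows: dummy packets sampled from a pool built from the user's own previous traces are interleaved with the real traffic, so the added noise looks realistic and no real packet is delayed. This is a hypothetical illustration of the approach described above; the function name, the insertion probability, and the pool structure are assumptions, not the dissertation's algorithm.

import random

def add_cover_traffic(trace, history_pool, insertion_prob=0.2, rng=None):
    # Interleave dummy packets (flagged True) drawn from the user's previously
    # recorded packets with the real trace; real packets are never delayed.
    rng = rng or random.Random()
    padded = []
    for pkt in trace:
        padded.append((pkt, False))  # real packet, forwarded immediately
        if history_pool and rng.random() < insertion_prob:
            padded.append((rng.choice(history_pool), True))  # dummy packet
    return padded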
dc.description.abstract One main concern about website fingerprinting is its practicality. The common assumption in previous work is that a victim visits one website at a time and that the adversary has access to the complete network trace of that website. However, this is not realistic. In our work, we aim to close the gap between lab experiments and realistic conditions. We propose a new algorithm based on a Hidden Markov Model to handle situations in which the victim visits one website after another. We then employ deep learning algorithms to handle situations in which the captured traces are imperfect, such as partial traces, two-page traces, or traces with background noise.
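A deep learning classifier of the kind referred to above could, for example, operate on the sequence of packet directions in a trace. The following PyTorch sketch is an illustrative stand-in only; the architecture, input length, and class count are assumptions and do not reproduce the dissertation's model.

import torch
import torch.nn as nn

class TraceCNN(nn.Module):
    # Small 1-D CNN that classifies a fixed-length sequence of packet
    # directions (+1 outgoing, -1 incoming, 0 padding) into one of
    # `num_sites` websites.
    def __init__(self, num_sites, trace_len=5000):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=8, stride=4), nn.ReLU(),
            nn.AdaptiveMaxPool1d(16),
        )
        self.classifier = nn.Linear(64 * 16, num_sites)

    def forward(self, directions):            # shape: (batch, trace_len)
        x = directions.unsqueeze(1).float()   # -> (batch, 1, trace_len)
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example: logits = TraceCNN(num_sites=95)(torch.randint(-1, 2, (4, 5000)))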
dc.format application/pdf
dc.language en_US
dc.rights Copyright is held by the author who has granted the Oklahoma State University Library the non-exclusive right to share this material in its institutional repository. Contact Digital Library Services at lib-dls@okstate.edu or 405-744-9161 for the permission policy on the use, reproduction or distribution of this material.
dc.title Website Fingerprinting Attacks
dc.contributor.committeeMember Park, Nohpill
dc.contributor.committeeMember Crick, Christopher
dc.contributor.committeeMember Gong, Yanmin
osu.filename Cui_okstate_0664D_16246.pdf
osu.accesstype Open Access
dc.type.genre Dissertation
dc.type.material Text
dc.subject.keywords anonymity
dc.subject.keywords cover traffic
dc.subject.keywords deep learning
dc.subject.keywords practicality
dc.subject.keywords privacy
dc.subject.keywords website fingerprinting
thesis.degree.discipline Computer Science
thesis.degree.grantor Oklahoma State University

