Publication:
Discrepancy Bounds for Deterministic Acceptance-Rejection Samplers

dc.contributor.advisor Dick, Josef en_US
dc.contributor.advisor Kuo, Frances en_US
dc.contributor.author Zhu, Houying en_US
dc.date.accessioned 2022-03-22T12:28:23Z
dc.date.available 2022-03-22T12:28:23Z
dc.date.issued 2016 en_US
dc.description.abstract The Monte Carlo method is one of the most widely used numerical methods for simulating probability distributions. Its convergence rate is independent of the dimension, but slow. Quasi-Monte Carlo methods, which can be seen as a deterministic counterpart of Monte Carlo methods, have been developed to improve the convergence rate and thereby achieve greater accuracy; this improvement depends in part on generating samples with small discrepancy. Carrying the quasi-Monte Carlo idea over to statistical sampling is a natural way to improve the convergence rate and to widen the range of practical applications. In this thesis we focus on constructing low-discrepancy point sets with respect to non-uniform target measures using the acceptance-rejection sampler. We consider acceptance-rejection samplers based on different driver sequences, where the driver sequence is chosen such that the discrepancy between the empirical distribution and the target distribution is small; digital nets, stratified inputs and lattice point sets are used for this purpose. The central contribution of this work is the establishment of discrepancy bounds for samples generated by acceptance-rejection samplers. Together with a Koksma-Hlawka type inequality, these bounds yield an improvement of the numerical integration error for non-uniform measures. Furthermore, we introduce a quality criterion for measuring the goodness of driver sequences in the acceptance-rejection method. Explicit constructions of driver sequences yield a convergence order beyond that of plain Monte Carlo for samples generated by the deterministic acceptance-rejection sampler in dimension one. The proposed algorithms are tested numerically and compared with the standard acceptance-rejection algorithm using pseudo-random inputs. The empirical evidence confirms that using low-discrepancy driver sequences in the acceptance-rejection sampler outperforms the original algorithm. en_US
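To illustrate the idea described in the abstract, the following is a minimal sketch (not the thesis's own construction) of a deterministic acceptance-rejection sampler in dimension one, driven by a two-dimensional Halton-type low-discrepancy sequence built from radical-inverse (van der Corput) points. The target density `f(x) = 2x` on [0, 1], the bases 2 and 3, and the function names are illustrative assumptions.

```python
def van_der_corput(n, base):
    """Radical inverse of the integer n in the given base, a point in [0, 1)."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        q += (n % base) * bk
        n //= base
        bk /= base
    return q

def ar_sample(target_pdf, bound, n_points):
    """Deterministic acceptance-rejection sampler (illustrative sketch).

    The driver sequence is the 2D Halton sequence in bases 2 and 3:
    the first coordinate is the proposal, the second decides acceptance.
    target_pdf is a density on [0, 1] assumed bounded above by `bound`.
    """
    accepted = []
    for i in range(1, n_points + 1):
        x = van_der_corput(i, 2)   # proposal point
        y = van_der_corput(i, 3)   # acceptance coordinate
        if bound * y <= target_pdf(x):
            accepted.append(x)
    return accepted

# Example: sample the density f(x) = 2x on [0, 1] (bounded by 2).
samples = ar_sample(lambda x: 2.0 * x, 2.0, 1000)
```

Because the driver sequence is deterministic and low-discrepancy, the accepted points approximate the target distribution without any pseudo-random input; the thesis studies how small the discrepancy of such accepted point sets can be made.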
dc.identifier.uri http://hdl.handle.net/1959.4/56305
dc.language English
dc.language.iso EN en_US
dc.publisher UNSW, Sydney en_US
dc.rights CC BY-NC-ND 3.0 en_US
dc.rights.uri https://creativecommons.org/licenses/by-nc-nd/3.0/au/ en_US
dc.subject.other Quasi-Monte Carlo method. en_US
dc.subject.other Acceptance-rejection sampler. en_US
dc.subject.other Low-discrepancy sequence. en_US
dc.title Discrepancy Bounds for Deterministic Acceptance-Rejection Samplers en_US
dc.type Thesis en_US
dcterms.accessRights open access
dcterms.rightsHolder Zhu, Houying
dspace.entity.type Publication en_US
unsw.accessRights.uri https://purl.org/coar/access_right/c_abf2
unsw.identifier.doi https://doi.org/10.26190/unsworks/19043
unsw.relation.faculty Science
unsw.relation.originalPublicationAffiliation Zhu, Houying, Mathematics & Statistics, Faculty of Science, UNSW en_US
unsw.relation.originalPublicationAffiliation Dick, Josef, Mathematics & Statistics, Faculty of Science, UNSW en_US
unsw.relation.originalPublicationAffiliation Kuo, Frances, Mathematics & Statistics, Faculty of Science, UNSW en_US
unsw.relation.school School of Mathematics & Statistics
unsw.thesis.degreetype PhD Doctorate en_US
Files
Original bundle
Name: public version.pdf
Size: 2.92 MB
Format: application/pdf