Baby examples exercises, Study notes of Mathematical Modeling and Simulation

More practice baby examples for study


Evil Random Numbers

Pseudorandom number generation is a crucial aspect of many computer applications, particularly those involving simulation, cryptography, and statistical sampling. Pseudorandom number generators (PRNGs) are algorithms that aim to produce sequences of numbers that appear random, even though they are generated by deterministic processes. The numbers are called "pseudorandom" because they merely exhibit properties of randomness and are not genuinely random in the strictest sense. A good PRNG is characterized by a combination of factors, including high-quality randomness, a long period, and efficient implementation. A bad pseudorandom number generator can have various shortcomings, which include:

Poor randomness: A PRNG should generate numbers that appear random, meaning they are uniformly distributed and independent of one another. A bad PRNG might produce numbers with discernible patterns or correlations, which can lead to biased outcomes in simulations or statistical sampling and can even compromise security in cryptographic applications.

Short period: The period of a PRNG is the length of the sequence it generates before it starts repeating. A bad PRNG has a short period, which is problematic in applications that require a long sequence of random numbers (a small example is sketched after this list); repeated sequences can introduce patterns and correlations that affect the results of the application.

Predictability: A secure PRNG should be difficult to predict even if an attacker has knowledge of previous outputs. If a PRNG is easy to predict, it can be exploited in cryptographic applications, leading to potential security breaches. A bad PRNG might have weak algorithms or insufficient entropy sources, making its output predictable to some extent.

Poor seeding: The initial value used to start the generation of pseudorandom numbers is called the seed. A bad PRNG might have a weak seeding mechanism, resulting in the same sequence of numbers being generated across different instances or applications. This can have serious implications for security and randomness.

Inefficient implementation: A PRNG should be computationally efficient so that random numbers can be generated quickly. A bad PRNG might be slow or resource-intensive, which hinders performance in applications where large quantities of random numbers are needed rapidly.

Lack of cryptographic strength: Some PRNGs are designed specifically for cryptographic applications, where the quality of randomness and unpredictability are paramount. A bad PRNG might not meet the stringent requirements of these applications, potentially compromising the security of sensitive data and communications.
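To make the short-period problem concrete, here is a small toy example in R; the parameters are chosen purely for illustration.

# Toy linear congruential generator: x_{n+1} = (a * x_n + c) mod m
lcg_demo <- function(n, seed, a, c, m) {
  x <- numeric(n)
  state <- seed
  for (i in 1:n) {
    state <- (a * state + c) %% m
    x[i] <- state
  }
  x
}

# Badly chosen parameters: the output locks into the cycle 6, 2, 6, 2, ... immediately
lcg_demo(10, seed = 2, a = 3, c = 0, m = 16)

# Better parameters (satisfying the Hull-Dobell conditions): full period of m = 16
lcg_demo(16, seed = 2, a = 5, c = 3, m = 16)

A real generator works with a much larger modulus, but the failure mode is the same: once the state repeats, every subsequent draw repeats as well.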

Evil Random Numbers Example R Code

The random number generation techniques employed in the code are:

Box-Muller method: This method generates normally distributed random numbers from uniformly distributed random numbers by performing coordinate transformations. It requires two independent uniform random numbers (u1 and u2) to generate two normally distributed random numbers (z0 and z1). All of the following generators produce the uniform random variates that are then transformed using Box-Muller to create the plots.

Built-in RNG (runif_wrapper): This function is a wrapper for R's built-in runif() function, which generates uniformly distributed random numbers. R's runif() uses the Mersenne-Twister algorithm, a widely used and well-tested pseudorandom number generator with good statistical properties. The Mersenne-Twister algorithm provides a long period and high-quality random numbers, making it suitable for a wide range of applications.

Linear Congruential Generator (LCG): The LCG is a simple method that generates random numbers using a linear recurrence with specified parameters (lcg_a, lcg_c, and lcg_m). The generated numbers have a uniform distribution. The code includes various LCG configurations: one good "desert island" LCG and four bad LCGs.

Xorshift RNG: The Xorshift algorithm is a fast and efficient method for generating random numbers using bitwise operations (bitwise XOR and bitwise shifts). It has better statistical properties than LCGs and is known for its simplicity and speed.

Tausworthe RNG: The Tausworthe method is a linear feedback shift register-based random number generator. It generates a binary sequence by XORing the bits at specific positions in the register and uses these bits to form the random number.

The code generates 5,000 random numbers using each of these techniques, transforms them with Box-Muller, and visualizes the distributions using ggplot2. Each generated dataset is plotted on a 2D plane to visually compare the distributions, revealing the quality of the random number generation. When examining the plots, notice that the built-in R functionality and the "desert island" LCG look like scattered clouds of points, while the bad LCGs all show some sort of pattern, especially Bad LCG 3 and Bad LCG 4. A zoomed-in plot is provided at the end to reveal even more of the patterns present in Bad LCG 1 and Bad LCG 2.
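The helper functions runif_wrapper, lcg, box_muller, and create_4_quadrant_axis are used by the code below but are not shown in this excerpt. A minimal sketch consistent with the description above is given here; the course implementations may differ in detail, and the axis helper in particular is a hypothetical stand-in.

library(ggplot2)

# Wrapper around R's built-in uniform generator (Mersenne-Twister)
runif_wrapper <- function(n, seed) {
  set.seed(seed)
  runif(n)
}

# Linear congruential generator: x_{n+1} = (a * x_n + c) mod m, scaled to (0, 1)
lcg <- function(n, seed, lcg_a, lcg_c, lcg_m) {
  x <- numeric(n)
  state <- seed
  for (i in 1:n) {
    state <- (lcg_a * state + lcg_c) %% lcg_m
    x[i] <- state / lcg_m
  }
  x
}

# Box-Muller transform: pairs of uniforms (u1, u2) -> pairs of standard normals (z0, z1)
box_muller <- function(n, rng, seed = 1) {
  u  <- rng(2 * n, seed)        # one stream of 2n uniforms, split in half (pairing scheme assumed)
  u1 <- u[1:n]
  u2 <- u[(n + 1):(2 * n)]
  z0 <- sqrt(-2 * log(u1)) * cos(2 * pi * u2)
  z1 <- sqrt(-2 * log(u1)) * sin(2 * pi * u2)
  list(x = z0, y = z1)          # returned as x/y so each pair can be plotted on a 2D plane
}

# Hypothetical stand-in for create_4_quadrant_axis(): draws the x and y axes through
# the origin so that all four quadrants of the scatter plot are visible.
create_4_quadrant_axis <- function() {
  list(geom_hline(yintercept = 0), geom_vline(xintercept = 0))
}

The key assumption in this sketch is how box_muller pairs up the uniforms: it draws a single stream of 2n values from the supplied generator and splits it in half, so each (u1, u2) pair comes from the same stream.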

library(ggplot2)    # scatter plots
library(gridExtra)  # grid.arrange for the 2x2 panel layout

# Xorshift RNG. The outer function header and the shift amounts were not shown in this
# excerpt; 13/17/5 is the classic xorshift32 triple and is assumed here.
xorshift <- function(n, seed, a = 13, b = 17, c = 5) {
  xorshift32 <- function(state) {
    state <- bitwXor(state, bitwShiftL(state, a))
    state <- bitwXor(state, bitwShiftR(state, b))
    state <- bitwXor(state, bitwShiftL(state, c))
    return(state)
  }
  random_integers <- integer(n)
  state <- seed
  for (i in 1:n) {
    state <- xorshift32(state)
    random_integers[i] <- state
  }
  # Scale the integer states to uniform variates on (0, 1)
  random_numbers <- as.double(random_integers %% (2^31 - 1)) / (2^31 - 1)
  return(random_numbers)
}

# Tausworthe RNG (linear feedback shift register)
tausworthe <- function(n, seed, r, q, l) {
  # [Tausworthe Code Redacted]
}

n <- 5000

# Good random number generator (R's built-in)
builtin <- box_muller(n, runif_wrapper, seed = 1)

# Good "desert island" LCG
desert <- box_muller(n, function(n, seed) lcg(n, seed = 1, lcg_a = 16807, lcg_c = 0, lcg_m = 2^31 - 1))

# Bad LCG 1
bad_lcg1 <- box_muller(n, function(n, seed) lcg(n, seed = 1, lcg_a = 2147483587, lcg_c = 1664525, lcg_m = 2^32))

# Bad LCG 2
bad_lcg2 <- box_muller(n, function(n, seed) lcg(n, seed = 2, lcg_a = 25214903917, lcg_c = 115, lcg_m = 2^48))

# Bad LCG 3
bad_lcg3 <- box_muller(n, function(n, seed) lcg(n, seed = 3, lcg_a = 3, lcg_c = 1, lcg_m = 2^31))

# Bad LCG 4
bad_lcg4 <- box_muller(n, function(n, seed) lcg(n, seed = 4, lcg_a = 5, lcg_c = 3, lcg_m = 2^31))

# Xorshift RNG
xor <- box_muller(n, xorshift, seed = 1)

# Tausworthe RNG
taus <- # [Tausworthe Code Redacted]

# First panel of plots: built-in RNG, "desert island" LCG, Bad LCG 1, Bad LCG 2
plot_list <- list(builtin = builtin, desert = desert, bad_lcg1 = bad_lcg1, bad_lcg2 = bad_lcg2)
colors <- c("purple", "blue", "red", "orange")
titles <- c("Built-in RNG", "Desert Island LCG", "Bad LCG 1", "Bad LCG 2")

plots <- lapply(1:length(plot_list), function(i) {
  p <- ggplot() +
    create_4_quadrant_axis() +
    geom_point(data = data.frame(x = plot_list[[i]]$x, y = plot_list[[i]]$y),
               aes(x = x, y = y), col = colors[i], alpha = 0.2) +
    labs(title = titles[i], x = "x", y = "y") +
    theme_minimal() +
    theme(plot.title = element_text(size = 16),
          axis.title = element_text(size = 14),
          axis.text = element_text(size = 12))
  return(p)
})

grid.arrange(grobs = plots, ncol = 2)

Here is a zoomed-in version of the first graph to better see the two bad LCGs.

Evil Random Numbers Example Python Code

Python Notebook: note that "Bad LCG 2" looks better in Python because it uses 64-bit data types.

Queues ‘R’ Us

Kendall's notation is a widely used system for describing queueing systems. In its basic form it consists of three parts, A/B/C, where A refers to the distribution of interarrival times, B refers to the distribution of service times, and C refers to the number of servers in the system. For the M/M/1 queue, the Kendall notation is M/M/1, where "M" indicates that the interarrival times and service times follow an exponential distribution, and "1" indicates that there is only one server. The M/M/1 queue is therefore a specific type of queueing system that assumes a Poisson arrival process, exponentially distributed service times, and a single server.
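As a quick illustration, the standard M/M/1 steady-state formulas can be evaluated directly in R; the arrival rate and service rate below are assumed example values, not part of the notes.

# Illustrative M/M/1 steady-state measures (lambda and mu are assumed example values)
lambda <- 4   # arrival rate (customers per unit time)
mu     <- 5   # service rate (customers per unit time); stability requires lambda < mu

rho <- lambda / mu           # server utilization
L   <- rho / (1 - rho)       # expected number in the system
Lq  <- rho^2 / (1 - rho)     # expected number waiting in the queue
W   <- 1 / (mu - lambda)     # expected time in the system
Wq  <- rho / (mu - lambda)   # expected waiting time in the queue

c(utilization = rho, L = L, Lq = Lq, W = W, Wq = Wq)

These closed-form results are what a discrete-event simulation of an M/M/1 queue should approach as the run length grows.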

Stocks can be combined into a portfolio to achieve a desired level of risk and return. A portfolio with a higher mix of high-volatility stocks will have higher risk and the potential for higher returns, while a portfolio with a lower mix of high-volatility stocks will have lower risk and the potential for lower returns. Simulation allows for the analysis of various scenarios and the estimation of the probability of certain outcomes, which can be useful in making investment decisions and managing risk. For example, simulation can be used to estimate the probability of a portfolio experiencing a certain level of loss, or achieving a certain level of return, over a given period of time.

One limitation of simulation is that it is based on assumptions and models that may not accurately reflect the real-world behavior of the stock market. In addition, simulation requires the specification of various input parameters, such as the mean and standard deviation of stock returns, which may be difficult to estimate accurately. Furthermore, simulation can be computationally intensive and may require significant computing resources for large portfolios.

Stock Market Follies Code

Please see the Excel or Visual Basic based simulation provided in Module 1 of the course learning management system.
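The provided simulation is spreadsheet based, but the same idea can be sketched in R; the mean return, volatility, and starting value below are purely illustrative assumptions, not the course model.

# Monte Carlo sketch of one-year portfolio outcomes (all parameters assumed for illustration)
set.seed(1)
n_sims  <- 10000
mu_ret  <- 0.07    # assumed mean annual return
sd_ret  <- 0.18    # assumed annual volatility
initial <- 10000   # assumed starting portfolio value

annual_returns <- rnorm(n_sims, mean = mu_ret, sd = sd_ret)
final_values   <- initial * (1 + annual_returns)

# Estimated probability of ending the year with a loss
mean(final_values < initial)

Changing the assumed mix of high- and low-volatility holdings amounts to changing mu_ret and sd_ret, which is how the risk/return trade-off described above shows up in the simulated outcomes.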

Taking a Random Walk

A random walk is a mathematical model that describes the behavior of a variable that changes randomly over time. In the context of taking a normal step up or down every time unit, at each time step the variable moves above or below its current value by a random amount determined by a probability distribution. As the time steps become smaller and the number of steps becomes larger, the random walk converges to Brownian motion, a continuous-time stochastic process that describes the random movement of particles in a fluid or gas. In Brownian motion, the position of a particle changes continuously over time, and its typical displacement grows in proportion to the square root of time.

Einstein was the first to provide a mathematical explanation for Brownian motion, in 1905, which helped to confirm the existence of atoms and molecules. Black and Scholes later applied the concept of Brownian motion to develop the Black-Scholes model, a widely used method for pricing options; this work earned Scholes, together with Robert Merton, the Nobel Prize in Economics in 1997 (Black had died two years earlier).

Taking a Random Walk Example R Code

This code generates a random walk with 100 steps, where the direction of each step is determined by sampling randomly from the set {-1, 1}. The set.seed(123) line ensures that the same sequence of random numbers is generated each time the code is run. The cumsum() function calculates the cumulative sum of the steps, which gives the position of the random walk at each time step; the resulting vector of positions is stored in the variable random_walk. Finally, the code creates a plot of the random walk using the plot() function, showing the position of the random walk over time.

# Set seed for reproducibility
set.seed(123)

# Generate random walk: 100 steps of -1 or +1
steps <- sample(c(-1, 1), size = 100, replace = TRUE)
random_walk <- cumsum(steps)

# Plot random walk
plot(random_walk, type = "l", xlab = "Time", ylab = "Position")
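To see the convergence to Brownian motion described above, the same idea can be extended with normally distributed steps on a fine time grid; the horizon and grid size below are illustrative choices rather than part of the original example.

# Approximate Brownian motion with a finely scaled random walk
set.seed(123)
T_end <- 1                                        # total time horizon
n     <- 10000                                    # number of steps; more steps give a finer approximation
dt    <- T_end / n                                # time increment

increments <- rnorm(n, mean = 0, sd = sqrt(dt))   # normal steps with variance dt
brownian   <- cumsum(increments)                  # approximate Brownian path

plot(seq(dt, T_end, by = dt), brownian, type = "l",
     xlab = "Time", ylab = "Position")

Because each increment has variance dt, the variance of the position at time t is approximately t, which is exactly the square-root-of-time scaling of displacement mentioned earlier.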