the new kinematics don't seem to help :(

hiro98 2020-04-27 09:58:28 +02:00
parent f2796ad3c1
commit eeae315954
8 changed files with 13 additions and 73 deletions

Six binary image files (plot renders) also changed in this commit; the before/after previews are omitted here.


@@ -124,15 +124,16 @@ def find_upper_bound(f, interval, **kwargs):
def find_upper_bound_vector(f, interval):
result = shgo(_negate(f), bounds=interval, options=dict(maxfev=100))
if not result.success:
raise RuntimeError("Could not find an upper bound.", result)
upper_bound = -result.fun + 0.1
upper_bound = -result.fun
return upper_bound
def sample_unweighted_vector(
f, interval, seed=None, upper_bound=None, report_efficiency=False, num=None
f, interval, seed=None, upper_bound=None, report_efficiency=False
):
dimension = len(interval)
interval = np.array([_process_interval(i) for i in interval])
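As an aside on the hunk above: with the `+ 0.1` safety margin gone, `find_upper_bound_vector` returns exactly the maximum that `shgo` located, and that bound is what makes the subsequent hit-or-miss step valid, since every proposal can only be accepted with probability f(x)/upper_bound if f really stays below the bound. A minimal sketch of how such a bound is typically consumed (this is not the actual `sample_unweighted_vector` implementation, and the function name below is made up):

#+begin_src python
import numpy as np

def rejection_sample_vector(f, interval, upper_bound, num, rng=None):
    """Hit-or-miss sketch: draw uniform proposals in the box spanned by
    `interval` and keep those falling under f; requires f <= upper_bound."""
    rng = rng or np.random.default_rng()
    lower, upper = np.asarray(interval, dtype=float).T
    samples = []
    while len(samples) < num:
        x = rng.uniform(lower, upper)            # proposal point in the box
        if rng.uniform(0, upper_bound) <= f(x):  # accept with prob f(x)/bound
            samples.append(x)
    return np.array(samples)
#+end_src

If the optimizer underestimates the true maximum, the regions it missed are silently under-sampled, which is presumably what the removed safety margin was guarding against.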


@@ -230,62 +230,6 @@ nicer.
#+RESULTS:
** Test Driving
Now, let's try it out.
#+begin_src jupyter-python :exports both :results raw drawer
dist, x_limits = get_xs_distribution_with_pdf(
diff_xs, averaged_tchanel_q2, e_proton
)
#+end_src
#+RESULTS:
Let's plot it for some random values 😃.
#+begin_src jupyter-python :exports both :results raw drawer
fig, ax = set_up_plot()
pts = np.linspace(*interval_cosθ, 1000)
ax.plot(pts, [dist([cosθ, 0.3, 0.3]) for cosθ in pts])
#+end_src
#+RESULTS:
:RESULTS:
| <matplotlib.lines.Line2D | at | 0x7f0bd18b0700> |
[[file:./.ob-jupyter/4b7773815fe4943f422b5943cc67e48b7a75cc23.png]]
:END:
Having set both x values to the same number, we get a symmetric distribution, as expected.
Only the magnitude is a little startling! The value 1/3 is intentional!
Now we're going to take some samples!
But first we have to find an upper bound, which is expensive!
#+begin_src jupyter-python :exports both :results raw drawer
intervals = [interval_cosθ, [.01, 1], [.01, 1]]
upper_bound = monte_carlo.find_upper_bound_vector(dist, intervals)
upper_bound
#+end_src
#+RESULTS:
: 5721.40648474465
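With the safety margin removed in this commit, the number above is exactly the maximum `shgo` found, so an illustrative spot check (not part of the original notebook) costs little and would catch a badly underestimated bound; it reuses `dist`, `intervals` and `upper_bound` from the blocks above:

#+begin_src python
import numpy as np

# Illustrative sanity check: sample random points in the integration box and
# make sure none of them exceeds the bound that hit-or-miss sampling relies on.
rng = np.random.default_rng(0)
lower, upper = np.asarray(intervals, dtype=float).T
points = rng.uniform(lower, upper, size=(10_000, 3))
assert max(dist(p) for p in points) <= upper_bound
#+end_src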
Beware: this is darn slow, because the efficiency is so low (a rough cost estimate follows the result below).
#+begin_src jupyter-python :exports both :results raw drawer
sample_momenta(
100,
dist,
intervals,
e_proton,
upper_bound=upper_bound,
proc="auto",
cache="cache/pdf/samp_costh_test",
)[1]
#+end_src
#+RESULTS:
: 0.0004240744427426794
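An efficiency of roughly 4·10⁻⁴ means that, on average, about 1/efficiency ≈ 2400 proposals (and hence evaluations of `dist`, each involving PDF calls) are needed per accepted sample. A back-of-the-envelope estimate with the numbers from the call above:

#+begin_src python
# Rough cost of hit-or-miss sampling at the acceptance efficiency printed above.
efficiency = 0.0004240744427426794  # acceptance probability from the run above
wanted = 100                        # samples requested in the call above
print(f"expected evaluations of dist: {wanted / efficiency:,.0f}")
# roughly 236,000 evaluations for just 100 samples
#+end_src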
** Switching Horses: Sampling η
We set up a new distribution.
#+begin_src jupyter-python :exports both :results raw drawer
@@ -308,8 +252,8 @@ Plotting it, we can see that the variance is reduced.
#+RESULTS:
:RESULTS:
| <matplotlib.lines.Line2D | at | 0x7f0bcf65abe0> |
[[file:./.ob-jupyter/5f2e010bf22bb8e157d5327258268cf8a465510d.png]]
| <matplotlib.lines.Line2D | at | 0x7f3574d07820> |
[[file:./.ob-jupyter/5597ca6056db11908cfca64c2090d67e3b94cc9e.png]]
:END:
Let's plot how the pdf looks.
@@ -323,7 +267,7 @@ Let's plot how the pdf looks.
#+RESULTS:
:RESULTS:
| <matplotlib.lines.Line2D | at | 0x7f0bd1831100> |
| <matplotlib.lines.Line2D | at | 0x7f3572b7b8b0> |
[[file:./.ob-jupyter/b92f0c4b2c9f2195ae14444748fcdb7708d81c19.png]]
:END:
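The variance reduction mentioned in the hunk above comes from sampling in η instead of cos θ. Assuming η here is the usual pseudorapidity, cos θ = tanh η, and the Jacobian 1/cosh² η suppresses exactly the region |cos θ| → 1 where the distribution is strongly peaked, flattening the integrand. A minimal sketch of the change of variables (the reweighting of `dist` into `dist_η` is my assumption about how the hidden block works, not code from the document):

#+begin_src python
import numpy as np

def cosθ_of_η(η):
    """cos θ as a function of pseudorapidity: cos θ = tanh η."""
    return np.tanh(η)

def jacobian_dcosθ_dη(η):
    """|d(cos θ)/dη| = 1/cosh² η, the factor that reweights the density."""
    return 1 / np.cosh(η) ** 2

# presumably: dist_η([η, x_1, x_2]) = dist([cosθ_of_η(η), x_1, x_2]) * jacobian_dcosθ_dη(η)
#+end_src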
@@ -332,27 +276,22 @@ Now we sample some events. Doing this in parallel helps. We let the OS
figure out the CPU mapping.
#+begin_src jupyter-python :exports both :results raw drawer
intervals_η = [interval_η, [.01, 1], [.01, 1]]
intervals_η = [interval_η, [.1, 1], [.1, 1]]
result, eff = monte_carlo.sample_unweighted_array(
num_samples,
10000,
dist_η,
interval=intervals_η,
proc="auto",
report_efficiency=True,
cache="cache/pdf/huge",
#cache="cache/pdf/huge",
)
result
eff
#+end_src
#+RESULTS:
: array([[-1.43205911, 0.34762794, 0.01802916],
: [-2.13582434, 0.02851933, 0.07415878],
: [ 1.79962451, 0.01835771, 0.2725591 ],
: ...,
: [ 1.1782848 , 0.1113013 , 0.01554592],
: [-1.27942179, 0.12550189, 0.05389335],
: [ 2.22491947, 0.04076429, 0.13610809]])
: 0.003007891162077376
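For context on the parallel sampling mentioned in this hunk: the usual pattern (and presumably what `monte_carlo.sample_unweighted_array` does with `proc="auto"`) is to split the requested number of samples over worker processes and concatenate the partial arrays. A minimal self-contained sketch with made-up helper names, not the library code:

#+begin_src python
import multiprocessing

import numpy as np

def _sample_chunk(args):
    """Independent hit-or-miss loop run inside one worker process."""
    f, interval, upper_bound, num, seed = args
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(interval, dtype=float).T
    out = []
    while len(out) < num:
        x = rng.uniform(lower, upper)            # proposal point in the box
        if rng.uniform(0, upper_bound) <= f(x):  # accept with prob f(x)/bound
            out.append(x)
    return np.array(out)

def sample_parallel(f, interval, upper_bound, num, processes=None):
    """Split `num` samples over the CPUs and stack the partial results.

    `f` has to be picklable (a module-level function), because it is shipped
    to the worker processes.
    """
    processes = processes or multiprocessing.cpu_count()
    chunk = -(-num // processes)  # ceil division so we never undershoot
    args = [(f, interval, upper_bound, chunk, seed) for seed in range(processes)]
    with multiprocessing.Pool(processes) as pool:
        parts = pool.map(_sample_chunk, args)
    return np.concatenate(parts)[:num]
#+end_src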
@@ -369,6 +308,6 @@ Let's look at a histogram of eta samples.
#+RESULTS:
:RESULTS:
| <Figure | size | 432x288 | with | 1 | Axes> | <matplotlib.axes._subplots.AxesSubplot | at | 0x7f0bcd5a1be0> |
[[file:./.ob-jupyter/b55eed7143c116ac0471a510bd62174a42e1ac31.png]]
| <Figure | size | 432x288 | with | 1 | Axes> | <matplotlib.axes._subplots.AxesSubplot | at | 0x7f35728b26a0> |
[[file:./.ob-jupyter/ec474fc3576110c487c7fb31403cbab0a063efa9.png]]
:END: