|
7 | 7 | "source": [
|
8 | 8 | "# Load Collectives and Load Histograms\n",
|
9 | 9 | "\n",
|
10 |
| - "From the load (stress) side pyLife provides the classes `LoadCollective` and `LoadHistogram` to deal with load collectives. `LoadCollective` contains individal hysteresis loops whereas `LoadHistogram` contains a 2D-histogram of classes of hysteresis loops and the number of cycles with which they occur." |
| 10 | + "From the load (stress) side pyLife provides the classes [`LoadCollective`](https://pylife.readthedocs.io/en/stable/stress/load_collective.html) and [`LoadHistogram`](https://pylife.readthedocs.io/en/stable/stress/load_histogram.html) to deal with load collectives. `LoadCollective` contains individal hysteresis loops whereas `LoadHistogram` contains a 2D-histogram of classes of hysteresis loops and the number of cycles with which they occur." |
11 | 11 | ]
|
12 | 12 | },
|
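| | + { |
| | + "cell_type": "markdown", |
| | + "id": "load-collective-sketch-md", |
| | + "metadata": {}, |
| | + "source": [ |
| | + "The following is a minimal sketch of what a `LoadCollective` holds: a pandas dataframe of individual hysteresis loops given by their `from` and `to` load values. It assumes that importing `pylife.stress.collective` registers the `load_collective` accessor on dataframes; see the linked documentation for the details of your pyLife version." |
| | + ] |
| | + }, |
| | + { |
| | + "cell_type": "code", |
| | + "execution_count": null, |
| | + "id": "load-collective-sketch-code", |
| | + "metadata": {}, |
| | + "outputs": [], |
| | + "source": [ |
| | + "import pandas as pd\n", |
| | + "\n", |
| | + "# assumption: this import registers the `load_collective` accessor on pandas dataframes\n", |
| | + "import pylife.stress.collective\n", |
| | + "\n", |
| | + "# three individual hysteresis loops, given by their from/to load values\n", |
| | + "loops = pd.DataFrame({\n", |
| | + "    'from': [1.0, -2.0, 1.0],\n", |
| | + "    'to': [-1.0, 2.0, -1.0]\n", |
| | + "})\n", |
| | + "\n", |
| | + "# the accessor derives quantities like the amplitude of each loop\n", |
| | + "loops.load_collective.amplitude" |
| | + ] |
| | + }, |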
13 | 13 | {
|
|
93 | 93 | "id": "31b400a3",
|
94 | 94 | "metadata": {},
|
95 | 95 | "source": [
|
96 |
| - "As you can see, the rainflow analysis found five histresis loops, three from 1.0 to -1.0 and two from -2.0 to 2.0. Alternatively you can ask the recorder for a load histogram:" |
| 96 | + "As you can see, the rainflow analysis found five hystresis loops, three from 1.0 to -1.0 and two from -2.0 to 2.0. Alternatively you can ask the recorder for a load histogram:" |
97 | 97 | ]
|
98 | 98 | },
|
99 | 99 | {
|
|
293 | 293 | "plt.pcolormesh(X, Y, numpy_hist)"
|
294 | 294 | ]
|
295 | 295 | },
|
| 296 | + { |
| 297 | + "cell_type": "markdown", |
| 298 | + "id": "cf99a42b-3075-42e5-87c3-2d99ab55a681", |
| 299 | + "metadata": {}, |
| 300 | + "source": [ |
| 301 | + "In order to use this load histogram for a damage calculation we can perform a mean stress transformation to transorm all the hysteresis loops to one given R-value." |
| 302 | + ] |
| 303 | + }, |
| 304 | + { |
| 305 | + "cell_type": "code", |
| 306 | + "execution_count": null, |
| 307 | + "id": "a0d06dfd-2dd9-4689-b230-67c102220e31", |
| 308 | + "metadata": {}, |
| 309 | + "outputs": [], |
| 310 | + "source": [ |
| 311 | + "meanstress_sensitivity = pd.Series({\n", |
| 312 | + " 'M': 0.3,\n", |
| 313 | + " 'M2': 0.2\n", |
| 314 | + "})\n", |
| 315 | + "\n", |
| 316 | + "transformed_histogram = histogram.meanstress_transform.fkm_goodman(meanstress_sensitivity, R_goal=-1)\n", |
| 317 | + "transformed_histogram.to_pandas()" |
| 318 | + ] |
| 319 | + }, |
296 | 320 | {
|
297 | 321 | "cell_type": "markdown",
|
298 | 322 | "id": "0f505f1a",
|
299 | 323 | "metadata": {},
|
300 | 324 | "source": [
|
301 |
| - "We can also plot the cumulated version of the histogram. Therefor we put the amplitude and the cycles into a dataframe." |
| 325 | + "We can also plot the cumulated version of the histogram. Therefore we put the amplitude and the cycles into a dataframe." |
302 | 326 | ]
|
303 | 327 | },
|
304 | 328 | {
|
|
309 | 333 | "outputs": [],
|
310 | 334 | "source": [
|
311 | 335 | "df = pd.DataFrame({\n",
|
312 |
| - " 'cycles': histogram.load_collective.cycles, \n", |
313 |
| - " 'amplitude': histogram.load_collective.amplitude, \n", |
| 336 | + " 'cycles': transformed_histogram.cycles, \n", |
| 337 | + " 'amplitude': transformed_histogram.amplitude, \n", |
314 | 338 | "}).sort_values('amplitude', ascending=False)"
|
315 | 339 | ]
|
316 | 340 | },
|
|
332 | 356 | "plt.plot(np.cumsum(df.cycles), df.amplitude)\n",
|
333 | 357 | "plt.loglog()"
|
334 | 358 | ]
|
335 | 367 | }
|
336 | 368 | ],
|
337 | 369 | "metadata": {
|
|
349 | 381 | "mimetype": "text/x-python",
|
350 | 382 | "name": "python",
|
351 | 383 | "nbconvert_exporter": "python",
|
352 |
| - "pygments_lexer": "ipython3" |
| 384 | + "pygments_lexer": "ipython3", |
| 385 | + "version": "3.12.4" |
353 | 386 | }
|
354 | 387 | },
|
355 | 388 | "nbformat": 4,
|
|