Commit

Merge remote-tracking branch 'origin/master' into fix_histogram_logger_interface

nomadbl committed Jul 25, 2023
2 parents e129022 + a79d8bd commit 85de35a
Showing 20 changed files with 511 additions and 21 deletions.
2 changes: 1 addition & 1 deletion .gitignore
Original file line number Diff line number Diff line change
@@ -6,4 +6,4 @@ test/test_logs
docs/Manifest.toml

gen/proto
gen/protojl
gen/protojl
7 changes: 5 additions & 2 deletions Project.toml
@@ -1,7 +1,7 @@
name = "TensorBoardLogger"
uuid = "899adc3e-224a-11e9-021f-63837185c80f"
authors = ["Filippo Vicentini <filippovicentini@gmail.com>"]
version = "0.1.20"
version = "0.1.22"

[deps]
CRC32c = "8bf52ea8-c179-5cab-976a-9e18b702a9bc"
@@ -20,11 +20,14 @@ StatsBase = "0.27, 0.28, 0.29, 0.30, 0.31, 0.32, 0.33, 0.34"
julia = "1.6"

[extras]
Minio = "4281f0d9-7ae0-406e-9172-b7277c1efa20"
Cairo = "159f3aea-2a34-519c-b102-8c37f9878175"
Fontconfig = "186bb1d3-e1f7-5a2c-a377-96d770f13627"
Gadfly = "c91e804a-d5a3-530f-b6f0-dfbca275c004"
ImageMagick = "6218d12a-5da1-5696-b52f-db25d2ecc6d1"
LightGraphs = "093fc24a-ae57-5d10-9952-331d41423f4d"
Logging = "56ddb016-857b-54e1-b83d-db4d58db5568"
MLDatasets = "eb30cadb-4394-5ae3-aed4-317e484a6458"
Minio = "4281f0d9-7ae0-406e-9172-b7277c1efa20"
Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
PyPlot = "d330b81b-6aea-500a-939a-2ce795aea3ee"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
6 changes: 3 additions & 3 deletions README.md
@@ -39,8 +39,8 @@ logger in Julia:

You can log any type to TensorBoard. Numeric types will be logged as scalars,
arrays will be binned into histograms, images and audio will be logged as such,
and we even support [Plots](https://github.com/JuliaPlots/Plots.jl) and
[PyPlot](https://github.com/JuliaPlots/Plots.jl) figures!
and we even support [Plots](https://github.com/JuliaPlots/Plots.jl),
[PyPlot](https://github.com/JuliaPlots/Plots.jl) and [Gadfly](https://github.com/GiovineItalia/Gadfly.jl) figures!

For details about how types are logged by default, or how to customize this behaviour for your custom types,
refer to the documentation or the examples folder.
@@ -71,7 +71,7 @@ end
```

## Integration with third party packages
We also support native logging of the types defined by a few third-party packages, such as `Plots` and `PyPlot` plots.
We also support native logging of the types defined by a few third-party packages, such as `Plots`, `PyPlot` and `Gadfly` plots.
If there are other libraries that you think we should include in the list, please open an issue.

## Roadmap
4 changes: 3 additions & 1 deletion docs/make.jl
@@ -10,11 +10,13 @@ makedocs(
"Backends" => "custom_behaviour.md",
"Reading back data" => "deserialization.md",
"Extending" => "extending_behaviour.md",
"Explicit Interface" => "explicit_interface.md"
"Explicit Interface" => "explicit_interface.md",
"Hyperparameter logging" => "hyperparameters.md"
],
"Examples" => Any[
"Flux.jl" => "examples/flux.md"
"Optim.jl" => "examples/optim.md"
"Hyperparameter tuning" => "examples/hyperparameter_tuning.md"
]
],
format = Documenter.HTML(
53 changes: 53 additions & 0 deletions docs/src/examples/hyperparameter_tuning.md
@@ -0,0 +1,53 @@
# Hyperparameter tuning

We will start this example by setting up a simple random walk experiment, and seeing the effect of the hyperparameter `bias` on the results.

First, import the packages we will need with:
```julia
using TensorBoardLogger, Logging
using Random
```
Next, we will create a function which runs the experiment and logs the results, including the hyperparameters stored in the `config` dictionary.
```julia
function run_experiment(id, config)
logger = TBLogger("random_walk/run$id", tb_append)

# Specify all the metrics we want to track in a list
metric_names = ["scalar/position"]
write_hparams!(logger, config, metric_names)

epochs = config["epochs"]
sigma = config["sigma"]
bias = config["bias"]
with_logger(logger) do
x = 0.0
for i in 1:epochs
x += sigma * randn() + bias
@info "scalar" position = x
end
end
nothing
end
```
Now we can write a script which runs an experiment over a set of parameter values.
```julia
id = 0
for bias in LinRange(-0.1, 0.1, 11)
for epochs in [50, 100]
config = Dict(
"bias"=>bias,
"epochs"=>epochs,
"sigma"=>0.1
)
run_experiment(id, config)
id += 1
end
end
```

Below is an example of the dashboard you get when you open Tensorboard with the command:
```sh
tensorboard --logdir=random_walk
```

![tuning plot](tuning.png)
Binary file added docs/src/examples/tuning.png
10 changes: 10 additions & 0 deletions docs/src/hyperparameters.md
@@ -0,0 +1,10 @@
# Hyperparameter logging

In addition to logging your experiments, you may also wish to visualise the effect of hyperparameters on some plotted metrics. This can be done by logging the hyperparameters via the `write_hparams!` function, which takes a dictionary mapping hyperparameter names to their values (currently limited to `Real`, `Bool` or `String` types), along with the names of any metrics whose effects you want to view.

You can see how the HParams dashboard in Tensorboard can be used to tune hyperparameters on the [tensorboard website](https://www.tensorflow.org/tensorboard/hyperparameter_tuning_with_hparams).

## API
```@docs
write_hparams!
```
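A minimal usage sketch follows; the run directory, metric tag and hyperparameter names below are illustrative, not part of the API:

```julia
using TensorBoardLogger, Logging

logger = TBLogger("logs/run1")

# Log the metric we want to compare across runs
with_logger(logger) do
    @info "scalar" accuracy = 0.95
end

# Map hyperparameter names to Real/Bool/String values, and list the
# metric tags that should appear in the HParams dashboard
write_hparams!(logger,
               Dict("lr" => 1e-3, "batch_size" => 32, "optimizer" => "adam"),
               ["scalar/accuracy"])
```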
11 changes: 7 additions & 4 deletions docs/src/index.md
@@ -111,12 +111,15 @@ at [Reading back TensorBoard data](@ref)
We also support logging custom types from the following third-party libraries:
- [Plots.jl](https://github.com/JuliaPlots/Plots.jl): the `Plots.Plot` type will be rendered to PNG at the resolution specified by the object and logged as an image
- [PyPlot.jl](https://github.com/JuliaPy/PyPlot.jl): the `PyPlot.Figure` type will be rendered to PNG at the resolution specified by the object and logged as an image
- [Gadfly.jl](https://github.com/GiovineItalia/Gadfly.jl): the `Gadfly.Plot` type will be rendered to PNG at the resolution specified by the object and logged as an image
- [Gadfly.jl](https://github.com/GiovineItalia/Gadfly.jl): the `Gadfly.Plot` type will be rendered to PNG at the resolution specified by the object and logged as an image. The `Cairo` and `Fontconfig` packages must be imported for this functionality to work, as they are required by `Gadfly`.
- [Tracker.jl](https://github.com/FluxML/Tracker.jl): the `TrackedReal` and `TrackedArray` types will be logged as vector data
- [ValueHistories.jl](https://github.com/JuliaML/ValueHistories.jl): the `MVHistory` type is used to store the deserialized content of .proto files.
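As a sketch of the Gadfly case above (folder and tag names are illustrative; `Cairo` and `Fontconfig` must be installed):

```julia
using TensorBoardLogger, Logging
using Gadfly, Cairo, Fontconfig  # Cairo and Fontconfig enable PNG rendering

logger = TBLogger("logs/gadfly_example")
p = plot(x=rand(10), y=rand(10), Geom.point)

with_logger(logger) do
    @info "plots" gadfly = p  # logged as a PNG image
end
```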

## Explicit logging

In alternative, you can also log data to TensorBoard through its functional interface,
by calling the relevant method with a tag string and the data. For information
on this interface refer to [Explicit interface](@ref)...
As an alternative, you can also log data to TensorBoard through its functional interface, by calling the relevant method with a tag string and the data. For information on this interface refer to [Explicit interface](@ref).

## Hyperparameter tuning

Many experiments rely on hyperparameters, which can be difficult to tune. Tensorboard allows you to visualise the effect of your hyperparameters on your metrics, giving you an intuition for the correct hyperparameters for your task. For information on this API, see the [Hyperparameter logging](@ref) manual page.

14 changes: 14 additions & 0 deletions examples/Gadfly.jl
@@ -0,0 +1,14 @@
using TensorBoardLogger #import the TensorBoardLogger package
using Logging #import Logging package
using Gadfly, Cairo, Fontconfig

logger = TBLogger("Gadflylogs", tb_append) #create tensorboard logger

################log scalars example: y = x²################
#using logger interface
x = rand(100)
y = rand(100)
p = plot(x=x, y=y, Geom.point);
with_logger(logger) do
@info "gadfly" plot=p
end
38 changes: 38 additions & 0 deletions examples/HParams.jl
@@ -0,0 +1,38 @@
using TensorBoardLogger #import the TensorBoardLogger package
using Logging #import Logging package
using Random # Exports randn

# Run 10 experiments to see a plot
for j in 1:10
logger = TBLogger("random_walks/run$j", tb_append)

sigma = 0.1
epochs = 200
bias = (rand()*2 - 1) / 10 # create a random bias
use_seed = false
# Add in a dummy loss metric
with_logger(logger) do
x = 0.0
for i in 1:epochs
x += sigma * randn() + bias
@info "scalar" loss = x
end
end

# The hyperparameter config is a dictionary mapping parameter names to
# their values. It supports numerical types, bools and strings; non-bool
# numerical types are converted to Float64 for display.
hparams_config = Dict{String, Any}(
"sigma"=>sigma,
"epochs"=>epochs,
"bias"=>bias,
"use_seed"=>use_seed,
"method"=>"MC"
)
# Specify a list of tags that you want to show up in the hyperparameter
# comparison
metrics = ["scalar/loss"]

# Write the hyperparameters and metrics config to the logger.
write_hparams!(logger, hparams_config, metrics)
end
25 changes: 25 additions & 0 deletions examples/Scalars.jl
@@ -23,3 +23,28 @@ with_logger(logger) do
@info "scalar/complex" y = z
end
end


################control step increments with context################
with_logger(logger) do
for epoch in 1:10
for i=1:100
# increments global_step by default
with_TBLogger_hold_step() do
# all of these are logged at the same global_step
# and the logger global_step is only then increased
@info "train1/scalar" val=i
@info "train2/scalar" val2=i/2
@info "train3/scalar" val3=100-i
end
end
# step increment at end can be disabled for easy train/test sync
with_TBLogger_hold_step(;step_at_end=false) do
# all of these are logged at the same global_step
# and the logger global_step is only then increased
@info "test1/scalar" epoch=epoch
@info "test2/scalar" epoch2=epoch^2
@info "test3/scalar" epoch3=epoch^3
end
end
end
2 changes: 1 addition & 1 deletion gen/Project.toml
@@ -4,5 +4,5 @@ FilePathsBase = "48062228-2e41-5def-b9a4-89aafe57970f"
Glob = "c27321d9-0574-5035-807b-f59d2c89b15c"
ProtoBuf = "3349acd9-ac6a-5e09-bcdb-63829b23a429"

[comapt]
[compat]
ProtoBuf = "0.9.1"
15 changes: 15 additions & 0 deletions src/Optional/Gadfly.jl
@@ -0,0 +1,15 @@
import .Gadfly: Plot, render, draw
function Base.convert(t::Type{PngImage}, plot::Gadfly.Plot)
pb = PipeBuffer();
show(pb, MIME("image/png"), render(plot));
# draw(Gadfly.PNG(pb), plot); # leaving here for now, does same thing
return PngImage(pb)
end

preprocess(name, plot::Gadfly.Plot, data) = preprocess(name, convert(PngImage, plot), data)
preprocess(name, plots::AbstractArray{<:Gadfly.Plot}, data) = begin
for (i, plot)=enumerate(plots)
preprocess(name*"/$i", plot, data)
end
return data
end
53 changes: 52 additions & 1 deletion src/TBLogger.jl
@@ -38,7 +38,8 @@ export tb_append, tb_overwrite, tb_increment
Creates a TensorBoardLogger in the folder `logdir`. The second (optional)
argument specifies the behaviour if the `logdir` already exists: the default
choice `tb_increment` appends an increasing number 1,2... to `logdir`. Other
choices are `tb_overwrite`, which overwrites the previous folder, and `tb_append`.
choices are `tb_overwrite`, which overwrites the previous folder, and `tb_append`,
which adds to any existing logs.
Optional keyword argument `prefix` can be passed to prepend a path to the file
name (note, not the log directory). See `create_eventfile()`
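The three folder-handling modes described above might be exercised as in this sketch (paths are illustrative):

```julia
using TensorBoardLogger

lg_inc = TBLogger("logs/run")               # tb_increment (default): appends 1,2,... if logs/run exists
lg_ovr = TBLogger("logs/run", tb_overwrite) # replaces any existing logs/run folder
lg_app = TBLogger("logs/run", tb_append)    # keeps existing event files and adds to them
```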
@@ -298,3 +299,53 @@ Base.show(io::IO, mime::MIME"text/plain", tbl::TBLogger) = begin
"""
Base.print(io, str)
end

"""
`with_TBLogger_hold_step(f, [step]; step_at_end::Bool=true)`
Context function to ease control of logging steps and synchronization.
The amount of step increment can be controlled via `set_step_increment!`.
Example:
```julia
with_logger(lg) do
for epoch in 1:10
for i=1:100
# increments global_step by default
with_TBLogger_hold_step() do
# all of these are logged at the same global_step
# and the logger global_step is only then increased
@info "train1/scalar" i=i
@info "train2/scalar" i2=i^2
@info "train3/scalar" i3=i^3
end
end
# step increment at end can be disabled for easy train/test sync
with_TBLogger_hold_step(;step_at_end=false) do
# all of these are logged at the same global_step
# and the logger global_step is only then increased
@info "test1/scalar" i=i
@info "test2/scalar" i2=i^2
@info "test3/scalar" i3=i^3
end
end
end
```
"""
function with_TBLogger_hold_step(f, step::Int; step_at_end::Bool=true)
logger = CoreLogging.current_logger()
@assert logger isa TBLogger "with_TBLogger_hold_step: current logger is not a TBLogger, cannot establish current step automatically"
curr_step = logger.global_step
curr_increment = logger.step_increment
set_step!(logger, step)
set_step_increment!(logger, 0)
f()
set_step!(logger, curr_step)
set_step_increment!(logger, curr_increment)
step_at_end && increment_step!(logger, curr_increment)
end
function with_TBLogger_hold_step(f; step_at_end::Bool=true)
logger = CoreLogging.current_logger()
isa(logger, TBLogger) || error("with_TBLogger_hold_step: current logger is not a TBLogger, cannot establish current step automatically")
with_TBLogger_hold_step(f, logger.global_step; step_at_end=step_at_end)
end
15 changes: 13 additions & 2 deletions src/TensorBoardLogger.jl
@@ -19,9 +19,9 @@ using Base.CoreLogging: CoreLogging, AbstractLogger, LogLevel, Info,
handle_message, shouldlog, min_enabled_level, catch_exceptions, with_logger,
NullLogger

export TBLogger, reset!, set_step!, increment_step!, set_step_increment!
export TBLogger, reset!, set_step!, increment_step!, set_step_increment!, with_TBLogger_hold_step
export log_histogram, log_value, log_vector, log_text, log_image, log_images,
log_audio, log_audios, log_graph, log_embeddings, log_custom_scalar
log_audio, log_audios, log_graph, log_embeddings, log_custom_scalar, write_hparams!
export map_summaries, TBReader

export ImageFormat, L, CL, LC, LN, NL, NCL, NLC, CLN, LCN, HW, WH, HWC, WHC,
@@ -62,6 +62,7 @@ include("ImageFormat.jl")
const TB_PLUGIN_JLARRAY_NAME = "_jl_tbl_array_sz"

include("TBLogger.jl")
include("hparams.jl")
include("utils.jl") # CRC Utils
include("event.jl")
include("Loggers/base.jl")
@@ -102,6 +103,16 @@ function __init__()
@require PyPlot="d330b81b-6aea-500a-939a-2ce795aea3ee" begin
include("Optional/PyPlot.jl")
end
@require Gadfly="c91e804a-d5a3-530f-b6f0-dfbca275c004" begin
@require Fontconfig="186bb1d3-e1f7-5a2c-a377-96d770f13627" begin
@require Cairo="159f3aea-2a34-519c-b102-8c37f9878175" begin
using .Cairo
using .Fontconfig
include("Optional/Gadfly.jl")
end
end
end
# @require Gadfly="c91e804a-d5a3-530f-b6f0-dfbca275c004" include("Optional/Gadfly.jl")
@require Tracker="9f7883ad-71c0-57eb-9f7f-b5c9e6d3789c" begin
include("Optional/Tracker.jl")
end
9 changes: 4 additions & 5 deletions src/event.jl
@@ -29,17 +29,16 @@ format. The format follows the following rule (in bytes)
#4 N..N+8 UInt32 masked_CRC of #3
"""
function write_event(out::IO, event::Event)
data = PipeBuffer();
encode(ProtoEncoder(data), event)
event_bytes = serialize_proto(event)

#header
header = collect(reinterpret(UInt8, [data.size]))
header = collect(reinterpret(UInt8, [length(event_bytes)]))
crc_header = reinterpret(UInt8, UInt32[masked_crc32c(header)])
crc_data = reinterpret(UInt8, UInt32[masked_crc32c(data.data)])
crc_data = reinterpret(UInt8, UInt32[masked_crc32c(event_bytes)])

write(out, header)
write(out, crc_header)
write(out, data.data)
write(out, event_bytes)
write(out, crc_data)
flush(out)
end
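For reference, the masked CRC used in this framing follows the TFRecord convention. Below is a sketch of what the masking step conventionally looks like, using the `CRC32c` standard library; this mirrors the TFRecord format and is not necessarily the exact code in `utils.jl`:

```julia
using CRC32c  # standard-library CRC-32C (Castagnoli)

# TFRecord-style masking: rotate the checksum right by 15 bits and add a
# fixed constant, all in UInt32 arithmetic (wraps on overflow)
function masked_crc32c_sketch(data::Vector{UInt8})
    c = crc32c(data)
    ((c >> 15) | (c << 17)) + 0xa282ead8
end
```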