PINN #9

Open

Daylesss wants to merge 32 commits into Krekep:dev from Daylesss:PINN_to_dev

Conversation

@Daylesss

Added a class implementing physics-informed neural networks. The class works with IModel.

Owner

@Krekep Krekep left a comment


Overall, not bad.

Comment on lines 15 to 18
input_size: int = 1,
block_size: Optional[list] = None,
output_size: int = 1,
# TODO: config
Owner

Rewrite this using a config (as in #13).
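The constructor parameters above could be grouped into a config object along these lines. This is a sketch only: the field names mirror the current keyword arguments, while the actual config structure from #13 may differ.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PINNConfig:
    # Hypothetical field names; the real config from #13 may differ.
    input_size: int = 1
    block_size: List[int] = field(default_factory=list)
    output_size: int = 1
```

The constructor would then take a single `config: PINNConfig` argument instead of the loose `input_size` / `block_size` / `output_size` parameters, which also removes the `block_size is None` default-handling.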

Comment on lines 5 to 7
LossFunc: TypeAlias = Callable[
[tf.keras.Model, tf.GradientTape, tf.Tensor, tf.Tensor], float | int
]
Owner

I like this better:

class PhysicLoss:
  def __call__(self, model: tf.keras.Model, tape: tf.GradientTape, x: tf.Tensor) -> tf.Tensor:
    tape.watch(x)
    u = model(x)
    return u

Then it can be used like this:

class SinLoss_phys(PhysicLoss):
  """ y'' + 100y = 0, y(0) = 0, y'(0) = 10 """
  def __call__(self, model: tf.keras.Model, tape: tf.GradientTape, x: tf.Tensor) -> float | int:
    u = super().__call__(model, tape, x)  # tape must be persistent=True for the two gradient calls below
    
    u_x = tape.gradient(u, x)
    u_xx = tape.gradient(u_x, x)
    
    u_left = u_xx + 100 * u
    u_right = 0
    return mse(u_right - u_left)

class SinLoss_boundary(PhysicLoss):
  """ y'' + 100y = 0, y(0) = 0, y'(0) = 10 """
  def __call__(self, model: tf.keras.Model, tape: tf.GradientTape, x: tf.Tensor) -> float | int:
    x = tf.zeros_like(x)  # evaluate at the boundary point x = 0
    u = super().__call__(model, tape, x)
    
    u_left = u
    u_right = 0
    return mse(u_right - u_left)
    

class SinLoss_boundary2(PhysicLoss):
  """ y'' + 100y = 0, y(0) = 0, y'(0) = 10 """
  def __call__(self, model: tf.keras.Model, tape: tf.GradientTape, x: tf.Tensor) -> float | int:
    x = tf.zeros_like(x)  # evaluate at the boundary point x = 0
    u = super().__call__(model, tape, x)
    
    u_x = tape.gradient(u, x)
    u_left = u_x
    u_right = 10
    return mse(u_right - u_left)
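As a quick sanity check on the ODE in the docstrings above, the analytic solution of y'' + 100y = 0 with y(0) = 0, y'(0) = 10 is y = sin(10x), which is what a PINN trained with these three losses should converge toward. A finite-difference check in plain Python (no TensorFlow needed):

```python
import math

def y(x: float) -> float:
    # Analytic solution of y'' + 100y = 0, y(0) = 0, y'(0) = 10.
    return math.sin(10 * x)

h = 1e-4
for x in [0.0, 0.3, 1.0]:
    # Central finite differences for y' and y''.
    y_xx = (y(x + h) - 2 * y(x) + y(x - h)) / h**2
    residual = y_xx + 100 * y(x)      # should be ~0 everywhere
    assert abs(residual) < 1e-3

assert y(0.0) == 0.0                   # boundary condition y(0) = 0
y0_prime = (y(h) - y(-h)) / (2 * h)    # boundary condition y'(0) = 10
assert abs(y0_prime - 10.0) < 1e-3
```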

Owner

Or are there downsides here that I'm not seeing?

Comment on lines 31 to 36
self._name = "PINN"
if block_size is None:
block_size = []

decorator_params: List[Optional[Dict]] = [None]
if "decorator_params" in kwargs.keys():
Owner

None of this is needed here. In principle, it should be enough to write:

self.network = network

# then we can use the network like this
self.network.to_cpp()
self.network.call()
self.network.compile()
etc.
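The composition suggested above can be sketched as follows. The stub class and names here are illustrative; the real degann network would expose the `to_cpp()` / `call()` / `compile()` methods mentioned above.

```python
class StubNetwork:
    # Minimal stand-in for the wrapped degann network; the real
    # IModel would also expose to_cpp(), compile(), etc.
    def call(self, x):
        return 2 * x

class PINN:
    def __init__(self, network):
        # Hold the network by composition instead of duplicating
        # its construction logic inside the PINN class.
        self.network = network

    def call(self, x):
        # Delegate to the wrapped network.
        return self.network.call(x)

pinn = PINN(StubNetwork())
result = pinn.call(3)
```

This keeps PINN-specific logic (physics losses, training) in one place and leaves network construction to the existing class.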

# TODO: ignore the Tensorflow types here,
# because we pass exactly the specified parameters
# to degann.IModel.train in fit().
def train_step(self, data: tuple[tf.Tensor, tf.Tensor]): # type: ignore
Owner

You should instead write a custom training function, because fit requires data to be passed in, whereas in principle you can train a PINN without any data.
Also, you need to evaluate the equation at some set of points, so it seems x should be passed in. Alternatively, you could generate a fresh x on every training step, or make it an attribute of the network.

if loss is None:
loss = tf.constant(0)
# TODO: refactor
phys_loss = tf.constant(0)
Owner

I recommend simply keeping a list self.phys_losses and iterating over it in a loop; they all share the same signature anyway.
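The list-plus-loop idea above can be sketched as follows; the class name is illustrative, and the `(model, tape, x)` signature is the one used by the loss callables earlier in this review.

```python
class TotalPhysicsLoss:
    def __init__(self, phys_losses):
        # All entries share the (model, tape, x) signature.
        self.phys_losses = phys_losses

    def __call__(self, model, tape, x):
        # A plain loop replaces the hand-written phys_loss variables.
        return sum(f(model, tape, x) for f in self.phys_losses)
```

Adding a new boundary or residual term then means appending to the list instead of editing the training step.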

Best neural network presented as a dictionary
"""
best_net = None
# TODO: may be float("inf")?
Owner

+
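Initializing the running best with float("inf"), as the TODO suggests, removes the special case for the first candidate. A minimal sketch (function and parameter names are illustrative):

```python
def pick_best(candidates, evaluate):
    # Starting from float("inf") means the first candidate always
    # wins the initial comparison, so no None-check is needed.
    best_loss = float("inf")
    best_net = None
    for net in candidates:
        loss = evaluate(net)
        if loss < best_loss:
            best_loss, best_net = loss, net
    return best_net, best_loss
```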

Comment on lines 16 to 18
LossFunc: TypeAlias = Callable[
[tf.keras.Model, tf.GradientTape, tf.Tensor, tf.Tensor], float | int
]
Owner

Code duplication.

Comment on lines 33 to 34
class DenseParams(BaseNetParams):
pass
Owner

I don't quite understand: DenseParams does have parameters, doesn't it?
