In a neural network, the bias is a constant value added to the weighted sum of inputs before the result passes through the activation function. It allows the network to shift the decision boundary away from the origin and therefore learn more complex patterns. Without a bias, a neuron can only consider the weighted sum of its inputs, which limits its ability to model real-world data that is not centered around the origin.
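As a minimal sketch of this idea (the function names `sigmoid` and `neuron_output` are illustrative, not from any particular library), consider a single neuron evaluated at the origin. Without a bias its weighted sum there is always zero, no matter what the weights are; adding a bias shifts the output:

```python
import numpy as np

def sigmoid(z):
    # Standard logistic activation: squashes z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(x, w, b=0.0):
    # Weighted sum of inputs plus bias, passed through the activation
    return sigmoid(np.dot(w, x) + b)

x = np.array([0.0, 0.0])           # input at the origin
w = np.array([0.8, -0.4])

# Without a bias, the weighted sum at the origin is always 0,
# so the output is pinned to sigmoid(0) = 0.5 regardless of w.
print(neuron_output(x, w))         # 0.5

# A bias shifts the decision boundary away from the origin.
print(neuron_output(x, w, b=2.0))  # ~0.88
```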

Weights determine the strength of a neuron's signal: each weight scales how much influence its corresponding input has on the neuron's output.
The weights in the network are initialized to small random numbers, for example in the range -1.0 to 1.0 or -0.5 to 0.5.
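A quick sketch of that initialization, assuming NumPy and an arbitrary layer shape of 3 inputs by 4 neurons (the seed and range are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(seed=0)   # seeded for reproducibility

n_inputs, n_neurons = 3, 4

# Small uniform random weights in [-0.5, 0.5];
# the [-1.0, 1.0] range works the same way.
weights = rng.uniform(-0.5, 0.5, size=(n_inputs, n_neurons))

# Biases are commonly initialized to zero.
biases = np.zeros(n_neurons)

print(weights)
```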