Python Forum
Using Autoencoder for Data Augmentation of numerical Dataset in Python
#3
(Jul-10-2020, 06:47 AM)hussainmujtaba Wrote: You should use the loss function 'sparse_categorical_crossentropy' instead of 'binary_crossentropy', as MNIST has more than two categories.
For a guide, you can take a look at this article about autoencoders.


Hey, thanks. I will take a look at the article.

But I am not using the MNIST dataset. I am using my own numerical dataset from a CSV file, and it has only one class. Actually, it has no labels at all. As far as I know, autoencoders don't need labels, because they just encode and decode the data.
But as I said, I don't really know where to start to make it run, and the example uses binary_crossentropy as well.
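If it helps, here is a minimal sketch of the idea for an unlabeled numerical dataset, using scikit-learn's MLPRegressor trained to reconstruct its own input as a stand-in autoencoder. The random array, layer sizes, and noise scale are made up for illustration; with real data you would load the CSV (e.g. with pandas) instead. Note that MLPRegressor minimizes squared reconstruction error, which is the usual choice for unbounded numerical features rather than binary_crossentropy:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

# Hypothetical numerical data standing in for the CSV file
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))

# Scale features to [0, 1] so the reconstruction targets are well-behaved
scaler = MinMaxScaler()
X_scaled = scaler.fit_transform(X)

# An MLP trained to reproduce its input acts as a simple autoencoder:
# the narrow middle layer (bottleneck) forces a compressed representation.
autoencoder = MLPRegressor(hidden_layer_sizes=(6, 3, 6),
                           activation="relu",
                           max_iter=2000,
                           random_state=0)
autoencoder.fit(X_scaled, X_scaled)

# Augmentation sketch: perturb the inputs slightly and pass them through
# the autoencoder, giving new samples close to the learned data manifold.
noise = rng.normal(scale=0.05, size=X_scaled.shape)
X_new = autoencoder.predict(np.clip(X_scaled + noise, 0.0, 1.0))

# Map the reconstructions back to the original feature scale
augmented = scaler.inverse_transform(X_new)
print(augmented.shape)  # same shape as the original data
```

The same structure carries over to a Keras autoencoder: compile with loss='mse' instead of 'binary_crossentropy' when the inputs are arbitrary numerical features.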
Reply


Messages In This Thread
RE: Using Autoencoder for Data Augmentation of numerical Dataset in Python - by Marvin93 - Jul-10-2020, 07:18 PM

