INTERACT FORUM

More => Old Versions => JRiver Media Center 27 for Windows => Topic started by: Wiresetc on October 04, 2021, 02:59:20 pm

Title: Creating a library list in Excel - Limit on copied rows?
Post by: Wiresetc on October 04, 2021, 02:59:20 pm
Hello all,

First an introduction:
Over decades of listening to music, my personal library has grown quite large: about 1.3 TB and more than 150,000 files.
Backing up this amount takes a lot of time and spare storage.
For this reason, I prefer to keep "mini-backups" on portable devices such as tablets and smartphones.
The one I use most often is a simple Excel file (.xlsx) containing the name, artist, and album of every file.

Until now, I have copied the data from JRiver to Excel manually, by which I mean selecting rows and copy-pasting with Ctrl+C and Ctrl+V.
For unknown reasons, select-all (Ctrl+A) does not work.
Copying 150,000 rows at once may simply be too much data:
at around 10,000 to 20,000 rows the copy fails and nothing appears in Excel.
Copying in 7 to 15 parts works around this.

I searched for other ways to export the library to Excel.
The best I found was to create a playlist, add all music to it, and export the playlist to a CSV file.
Unfortunately, when opening this CSV file in Excel, Japanese characters turn into boxes and dots. Example: ・ク・・」ョ・クュ・・・・
In short, this method was unusable.
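Boxes and dots like that are usually an encoding mismatch: Excel guesses the wrong character set when opening the CSV directly. One workaround is to re-save the file as UTF-8 with a byte-order mark, which Excel uses to detect the encoding. A minimal sketch, assuming the exported CSV is UTF-8 (the file names are placeholders, and JRiver's actual export encoding may differ):

```python
# Re-save a UTF-8 CSV with a BOM ("utf-8-sig") so Excel detects the
# encoding and renders Japanese text correctly. File names are
# placeholders; the source encoding (UTF-8) is an assumption.

def add_bom(src_path: str, dst_path: str) -> None:
    # Read assuming UTF-8; a decode error here would mean the
    # export actually used a different encoding.
    with open(src_path, "r", encoding="utf-8") as src:
        text = src.read()
    # "utf-8-sig" prepends the byte-order mark Excel looks for.
    with open(dst_path, "w", encoding="utf-8-sig", newline="") as dst:
        dst.write(text)

# Demo with a tiny sample row (Name, Artist, Album):
with open("library.csv", "w", encoding="utf-8") as f:
    f.write("さくら,宇多田ヒカル,First Love\n")
add_bom("library.csv", "library_excel.csv")
```

Alternatively, Excel's "Data > From Text/CSV" import dialog lets you pick the encoding by hand instead of rewriting the file.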

Now, for the questions:
1. Is anyone aware of a limit on the number of rows that can be copied from the JRiver library to Excel at once?
2. Is there another way to export library data to Excel and create a "mini-backup"?

Thank you for your time.
Title: Re: Creating a library list in Excel - Limit on copied rows?
Post by: Matt on October 04, 2021, 03:14:13 pm
Next build:
Changed: Copies are limited to 256 MB instead of 16 MB.
Title: Re: Creating a library list in Excel - Limit on copied rows?
Post by: Wiresetc on October 04, 2021, 04:41:49 pm
So copies are limited by total data size rather than row count. That answers the first question, thank you.
Also good to know that the next build will handle larger copies.
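A size cap would also explain why the failures appeared around 10,000 to 20,000 rows. A rough back-of-the-envelope check, where the bytes-per-row figure is purely an assumption (clipboard copies can carry more than the visible text):

```python
# Rough estimate of how many rows fit under a clipboard size cap.
# The bytes-per-row value is a guess; JRiver's clipboard format may
# include extra data, which would lower the effective row count.

def rows_that_fit(cap_bytes: int, bytes_per_row: int) -> int:
    return cap_bytes // bytes_per_row

OLD_CAP = 16 * 1024 * 1024    # 16 MB (previous limit)
NEW_CAP = 256 * 1024 * 1024   # 256 MB (next build)

# At ~1 KB per row the old cap allows roughly 16k rows, which lines
# up with copies failing somewhere around 10,000-20,000 rows:
print(rows_that_fit(OLD_CAP, 1024))   # 16384
print(rows_that_fit(NEW_CAP, 1024))   # 262144
```

Under the same assumption, the new 256 MB cap would comfortably cover the full 150,000-row library in a single copy.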