@source-transformer,
Thanks for the report. Here is a fix:
package main

import (
	"fmt"
	"log"

	"github.com/sugarme/gotch"
	"github.com/sugarme/gotch/nn"
	"github.com/sugarme/gotch/ts"
)

const (
	LearningRate float64 = 0.01
	HiddenSize   int64   = 256
	SeqLen       int64   = 180
	BatchSize    int64   = 256
	Epochs       int     = 3
	SamplingLen  int64   = 1024
)

func sample(data *ts.TextData, lstm *nn.LSTM, linear *nn.Linear, device gotch.Device) string {
	labels := data.Labels()
	inState := lstm.ZeroState(1)
	lastLabel := int64(0)
	var runes []rune
	for i := 0; i < int(SamplingLen); i++ {
		input := ts.MustZeros([]int64{1, labels}, gotch.Float, device)
		// NOTE: `Narrow` creates a tensor that shares the same storage.
		inputView := input.MustNarrow(1, lastLabel, 1, false)
		inputView.MustFill_(ts.FloatScalar(1.0))
		state := lstm.Step(input, inState)
		// 1. Delete the inState tensors (C-land memory)
		inState.(*nn.LSTMState).Tensor1.MustDrop()
		inState.(*nn.LSTMState).Tensor2.MustDrop()
		// 2. Then update with the current state
		inState = state
		// 3. Delete intermediate tensors
		input.MustDrop()
		inputView.MustDrop()
		forwardTs := linear.Forward(state.(*nn.LSTMState).H()).MustSqueezeDim(0, true).MustSoftmax(-1, gotch.Float, true)
		sampledY := forwardTs.MustMultinomial(1, false, true)
		lastLabel = sampledY.Int64Values()[0]
		sampledY.MustDrop()
		char := data.LabelForChar(lastLabel)
		runes = append(runes, char)
	}
	// Delete the last state
	inState.(*nn.LSTMState).Tensor1.MustDrop()
	inState.(*nn.LSTMState).Tensor2.MustDrop()
	return string(runes)
}

func main() {
	device := gotch.CudaIfAvailable()
	vs := nn.NewVarStore(device)
	data, err := ts.NewTextData("../../data/char-rnn/input.txt")
	if err != nil {
		panic(err)
	}
	labels := data.Labels()
	fmt.Printf("Dataset loaded, %v labels\n", labels)
	lstm := nn.NewLSTM(vs.Root(), labels, HiddenSize, nn.DefaultRNNConfig())
	linear := nn.NewLinear(vs.Root(), HiddenSize, labels, nn.DefaultLinearConfig())
	optConfig := nn.DefaultAdamConfig()
	opt, err := optConfig.Build(vs, LearningRate)
	if err != nil {
		log.Fatal(err)
	}
	for epoch := 1; epoch <= Epochs; epoch++ {
		sumLoss := 0.0
		cntLoss := 0.0
		dataIter := data.IterShuffle(SeqLen+1, BatchSize)
		batchCount := 0
		for {
			batchTs, ok := dataIter.Next()
			if !ok {
				break
			}
			batchNarrow := batchTs.MustNarrow(1, 0, SeqLen, false)
			xsOnehot := batchNarrow.Onehot(labels).MustTo(device, true) // [256, 180, 65]
			batchNarrow.MustDrop()
			ys := batchTs.MustNarrow(1, 1, SeqLen, true).MustTotype(gotch.Int64, true).MustTo(device, true).MustView([]int64{BatchSize * SeqLen}, true)
			lstmOut, outState := lstm.Seq(xsOnehot)
			// NOTE: Although outState is not used, it holds hidden C-land
			// memory that needs to be freed. Don't discard it with `_`.
			outState.(*nn.LSTMState).Tensor1.MustDrop()
			outState.(*nn.LSTMState).Tensor2.MustDrop()
			xsOnehot.MustDrop()
			logits := linear.Forward(lstmOut)
			lstmOut.MustDrop()
			lossView := logits.MustView([]int64{BatchSize * SeqLen, labels}, true)
			loss := lossView.CrossEntropyForLogits(ys)
			ys.MustDrop()
			lossView.MustDrop()
			opt.BackwardStepClip(loss, 0.5)
			sumLoss += loss.Float64Values()[0]
			cntLoss += 1.0
			loss.MustDrop()
			batchCount++
			if batchCount%500 == 0 {
				fmt.Printf("Epoch %v - Batch %v \n", epoch, batchCount)
			}
		} // end of batch loop
		sampleStr := sample(data, lstm, linear, device)
		fmt.Printf("Epoch %v - Loss: %v \n", epoch, sumLoss/cntLoss)
		fmt.Println(sampleStr)
		dataIter.Data.MustDrop()
		dataIter.Indexes.MustDrop()
	}
}
I tried it with 10 epochs and here is the output:
Dataset loaded, 65 labels
Epoch 1 - Batch 500
Epoch 1 - Batch 1000
Epoch 1 - Batch 1500
Epoch 1 - Batch 2000
Epoch 1 - Batch 2500
Epoch 1 - Batch 3000
Epoch 1 - Batch 3500
Epoch 1 - Batch 4000
Epoch 1 - Loss: 1.1906567276126208
lere's no storm, for whose of his bloody
To work gail should inar it to emposed the gill,
His Tog is first he lost at much, indeed
With apless shall seem base that I pray,
And some son any tormed borget, resince
And leave the child of all thy harms; and, sir.
PROSPERO:
'Tis women along:
So ear no more shall be gone alone,
That I seem for our honour cannot year in
ourself, contravies, answer you open my purse.
AUFIDIUS:
I dare not live,
Than so but smook; helry old man sir Receive seljom,
That shows your fellows lies to be your soarable thing:
'New, lord, you had not sanctuom into a last:
One that of such feedings for hurds
As Capullion, a mislike clamoos,
Upon his clamour in the fun-good deliver,
And without me would let him; come to me,
Art thou a saint, for charge than thou, or murder
Than your best friends abboricested.
BUCKINGHAM:
Kingle sheine!'
EXETER:
I shall ne'er keep it gold, then maidness play.
First Murderer:
This, would be in the slave of their lips conceive
To white your screech a digged, t
Epoch 2 - Batch 500
Epoch 2 - Batch 1000
Epoch 2 - Batch 1500
Epoch 2 - Batch 2000
Epoch 2 - Batch 2500
Epoch 2 - Batch 3000
Epoch 2 - Batch 3500
Epoch 2 - Batch 4000
Epoch 2 - Loss: 1.0380784590737988
or one flight dog thou there he's dead.
Follow my pack, let not my father mad!
OXFORD:
Findness, the bloaght smooth a lady glad,
But to be fighters Kate, I'll ut, thou calless.
LUCENTIO:
What's the autuard?
DUKE VINCENTIO:
Dismiss it must make their delight to the
prison; lords are we from these grave, the west before,
And with the tongues that I have too fallen?
Why, so we all of your title lords,
Stains feels.
LADY CAPULET:
What, die they will be to do
Asthom out that which your own marriage to me
Nom, inhaby her only senten or hands.
LADY CAPULET:
What is the day of Naples are?
MIRANDA:
O, yessen it, Bolingbrods well it,'
And here lies of boot at a Capulet
Withel is now seen to thee: marry, as I
gave him in a gods came from Kate, sir,
Hight be its brothers call me die with such prison.
FLORZALUS:
By Juliet, father, you may not.
Third Cither:
You are the butcherable bane.
KATHARINA:
The queen is very Wate, is kiss to blame these court
Affertain that I in remain.
ROMEO:
What now task't by the meane
Epoch 3 - Batch 500
Epoch 3 - Batch 1000
Epoch 3 - Batch 1500
Epoch 3 - Batch 2000
Epoch 3 - Batch 2500
Epoch 3 - Batch 3000
Epoch 3 - Batch 3500
Epoch 3 - Batch 4000
Epoch 3 - Loss: 1.0212256436067726
erse ears it straight:
To make this ben till Waith'st hook of death;
My gensle Northwndought that got A black and
straight view enfirdly in arms,
Between true; if this rawerisang of me,
Is thy fatal royel heir hasteming very
Threllowers come to any which could never perforce:
I have thou reports as I.
Widow:
The people hath amove us tear to make your appetital;
And so not little loss the world--
His mother, Rivoridio, I'll not be,
Light on me grow libertly of venomenot
Cominius, respects that was's the swoolect.
MARIANA:
What, are you so?
Did Juce is this lamentable she hath writ enough,
Subject as it is, on whom more: he mayster than
A thing I sent away; those sick before us,
Nor more than he to sit afording.
CLAUDIO:
Nay, but I will make for poor ease of your
great day-fools; and his money Mawfur, madam, I revels me with him:
Then, in devility then live underjoint.
BUSHY:
Fair queen, what a mast begin for which you have
To heavant homely two men's royarioly.
BUCKINGHAM:
Well, even with him: yet my unna
Epoch 4 - Batch 500
Epoch 4 - Batch 1000
Epoch 4 - Batch 1500
Epoch 4 - Batch 2000
Epoch 4 - Batch 2500
Epoch 4 - Batch 3000
Epoch 4 - Batch 3500
Epoch 4 - Batch 4000
Epoch 4 - Loss: 1.0155956024719663
ERKE:
What, are you the nurse? what is't within? When cheer'd,
Shrew'slepring at great sunshmating joys or weed'
Grages and white colours think.
KING RICHARD II:
That I am past she is contrive.
BALTHASAR:
I should have prevent you, if not she
Can you.
LUCIO:
Edward, who spoke'd your father, honour all,
Even to my crave before this wise and mercy,
I will be seen'st thou, sir; on Thoreso widow.
Widow:
No more: let them he hear wounds loar'd and always
Apolood by mighty dreadful trumpets palmsiffe,
I have herself chief and happy stroke,
I lawked themselves as thou art. Say no;
Bolingling to cherish weap: yet I did
Touch him and that even no course of Trosen!
KING RICHARD III:
Give me some brother's presence: be day eyes,
caps and labour.
How use but in readings, he shall thus for to do?
BAPTISTA:
Not so?
CLARENCE:
Mondly, my lords, as long and very great lies,
Virching smearing to the traitor, stard up Libun.
Take traitors in our joyft he follow so;
More money wond of this is true, to hand
with no questi
Epoch 5 - Batch 500
Epoch 5 - Batch 1000
Epoch 5 - Batch 1500
Epoch 5 - Batch 2000
Epoch 5 - Batch 2500
Epoch 5 - Batch 3000
Epoch 5 - Batch 3500
Epoch 5 - Batch 4000
Epoch 5 - Loss: 1.005847644420038
ather; when you shall not do our character.
HENRY BOLINGBROKE:
So may come from the maid, and fast your half, silenas. Forswox,
But I will flatter yet, every thoughts like,
Tears all the tires there be rush'd untill thy
parial shall we his putting herbents.
SICINIUS:
Wore with the breath of Time
Rveaves to bid thee, to this earth'st you
To wearn your fain, and that I may.
Gaoce:
away is Clarence, Signior Bolingbroke:
CAMILLO:
My lord,
This way, and lay alone, he did command:
An all-keeping my comfort like of what corn
canst, we derrer in the sea, what I in any consernical!
Which sad the town distress the earth to thine.
LORD ROSS:
The duke might here fit hard, and an old
shepherd, to seek by land and dreams to show what use
To pluck him a pedants and grief o' the busibess.
KING RICHARD II:
Up. ands your warrants, new come upon thee,
With that fixetingly pair in him!
Voldarn:
There's they are all at liging life.
And will you know thin so: hark. Now, marries in her watch
Mistre in Platicus?
DUKE OF AUME
Epoch 6 - Batch 500
Epoch 6 - Batch 1000
Epoch 6 - Batch 1500
Epoch 6 - Batch 2000
Epoch 6 - Batch 2500
Epoch 6 - Batch 3000
Epoch 6 - Batch 3500
Epoch 6 - Batch 4000
Epoch 6 - Loss: 1.002127003625952
RIANDLAND:
What cannot telant things to such a guest is che
Plach when the hollowly here to when their peace down
Though slanded shall not think and weeds but though
With chares: if you bett the duking honest wounds.
WARWICK:
Ten that mafees was almost call thee salte up;
The subjects; for for my husband dead the grace:
I'll halt a print that their match is fairly hearts
With batient ways on York; what should come
As you are welcom'd. In that beautious ire No
like women and but foreinst to's wedding.
Lord:
Grumbless, for my feeling Tranio.
BALTHAMIA:
And lessly help consumed for thy pale,
Call the utmost of his bold clears
And will between do fuslibe, and bring,
Ere he they our fearful prince shall have his;
I lear a poison, your offer can
let loss in peace no: the solicts shall prove him by;
Whose eyes rich few straight shall be war: and there! So so
should threaten Receit of that corovatians
Even in his life, so near the fardel again; yet I
wish'd it in this forward and among me
To meet I have been draw
Epoch 7 - Batch 500
Epoch 7 - Batch 1000
Epoch 7 - Batch 1500
Epoch 7 - Batch 2000
Epoch 7 - Batch 2500
Epoch 7 - Batch 3000
Epoch 7 - Batch 3500
Epoch 7 - Batch 4000
Epoch 7 - Loss: 1.0006365285650978
arewell;
And he was slain.
LORD BERK:
Wire she should try.
KATHARINA:
I have; affairs, if you keep so disgraced ere thou
Daught tunt, thou is through this night's sight!
Or feeling here, the time is like to sun
With all your happeal with soulled king:
Flirst, finde run, and send us thy linore's gnat
And at my outrant of great point withouts
But O horrake with Rutland news;
Therefore I see thee run a noboly by;
You hate these bones to that, in briel is the
rest and ingrate me in thy body.
LUCIO:
Armitak you, Kate, and my hoolds themselves
Where once is bone eashorm; and with Bianca
Broud a certain that maids custer; proudly
The oxced thou wasted against my good father:
His sunshmand was well assurefty as
We power to say 'Blessingeance lords, Elbow is
to kindly do examnter by the wing.
And yet I might hurt not here?
Not bloods come and truth it sife him women
With that rebellious life.
Officer:
No, so, pray, what of that?
EMILIA:
Too law us, I have let him be great,
I fear, his cause to do thee without.
A
Epoch 8 - Batch 500
Epoch 8 - Batch 1000
Epoch 8 - Batch 1500
Epoch 8 - Batch 2000
Epoch 8 - Batch 2500
Epoch 8 - Batch 3000
Epoch 8 - Batch 3500
Epoch 8 - Batch 4000
Epoch 8 - Loss: 0.9999360942971937
irst Citizen:
Do you have heard?
GLOUCESTER:
He is no other, and they virtue it.
BAGO:
ANTONIO:
When we both yet love for a holy learned.
Direct the fairest story followers piled eyes
Rovel I know the heavens, wedd this tongue chance the banks,
Who cannot be to rather farewell's sweet voice,
And I might king; but the desire they fall,
That sets the proportion should not pardon
Plank'd for demand, and to her legs.
MIRANDA:
What, ho! I resolved better:
I have her with the preparing them that
sughticuly, and kiss the wisdom and
Not the petition for them take't.
BIONDELLO:
Why, 'tis a poor father; if you were a prince's purks
you twenty father, how hath thy feignor day,
That is much morelier; in that have was fromated.
LEONTES:
Should take you more curt out for thee that all
continued with such grief, ambitious England and to
one than I am as brides it with Pompey? is ne'er she had we now backs
That Rome shall not read to death: but when
such darts? of God's worth in prosperity?
Nurse:
Alas, he done, awake
Epoch 9 - Batch 500
Epoch 9 - Batch 1000
Epoch 9 - Batch 1500
Epoch 9 - Batch 2000
Epoch 9 - Batch 2500
Epoch 9 - Batch 3000
Epoch 9 - Batch 3500
Epoch 9 - Batch 4000
Epoch 9 - Loss: 0.995901928405569
or pains Awhy; 'twas if a furty are but
and so becomes Behard Places pen in war weel
To cut from this allowing and damnable.
Bring me to your mind! Signior Lady go, I must.
How now! what's this? I think I should look'st thou out:
Thy brother's welping beautience! but be cut of
night?
QUEEN MARGARET:
Thrick main too-so, if an elfery shame,--
Abburning! Thou art these two in them in compound.
Third Servingman:
Do I, my lord; not King Edward: you mean
The conquest cunning creeping voice to tarn.
GLOUCESTER:
My lips weep mad. Once, got growing of Coriolanus:
Lend thy branches are a craft: hear I born.
The marshal partless Sometstard wherein no.
No be, you rogue he that fall from Pompes,
Resort your houses,--whypult'sl tempers
Dismisted him net again; insuech hear
Shall will I compasse-boddful sin; take it, 'em,
Or else regent, if nothing garlerfeits
At holy pent of Edward's hand yet bear
into the king's daughter be so convertly's face,
And having made thy sorrow, to have some night
Without cheer than the curre
Epoch 10 - Batch 500
Epoch 10 - Batch 1000
Epoch 10 - Batch 1500
Epoch 10 - Batch 2000
Epoch 10 - Batch 2500
Epoch 10 - Batch 3000
Epoch 10 - Batch 3500
Epoch 10 - Batch 4000
Epoch 10 - Loss: 0.9934289328158723
irst Murser Menevions must be gone.
HENRY BOLINGBROKE:
Lord, Claudio's vouch, I'll rather prefer it;
And where I am past children, gentle shall.
Come, you ship well 'scand'd in the title order.
First Citizen:
Why, that's good new most grief of thy nore.
Go when the suck sky and Montague,
To prosper Angelo with word with tears:
Seal up it doth disprease alonger: I,
though she be gone into proud. To one have sworn
We show yourself thou wilt not.
HENRY BOLINGBROKE:
O looking-fleeming fathom, will it not wash'd
With young arm of so intorant a custom with silver
By all thy odem traise early too:
What bit that find the easy carnable
Of what gentlemen 'tis through the sea fiffer a
vicer linger, each other is the man best greate the nurrish in
my pains his a merry instrumats fleding things not
To seed the next I pun up and make
By all that ever, so well assemble
To such Bohemia incline of good tribune
At harlowing nature stint again to keep
Because suchivos with him back again:
And shows it little in the grace of
Hi there, this does indeed seem improved - but I still see resident memory climb to multiple gigabytes over the lifetime of the process. I'm watching "RES" (resident) in top while the process runs.
Interestingly enough, the growth during the first epoch hovers between 800 MB and 1 GB - relatively stable.
Then I see about 500 MB of growth from the first call to the sample function. You can see it in Go's memstats, for example:
runtime.ReadMemStats(&m)
TotalAlloc from the above MemStats jumps by about 100 MB across the call to the sample function.
Then, interestingly, I see about 3 GB of growth during the second epoch.
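For reference, here is roughly how I'm bracketing the call - a minimal, self-contained sketch with a stand-in workload. Note that Go's MemStats only counts Go-heap allocations, so C-land memory allocated by libtorch shows up in RES but not in TotalAlloc:

package main

import (
	"fmt"
	"runtime"
)

func main() {
	var before, after runtime.MemStats
	runtime.ReadMemStats(&before)

	// Stand-in for the sample(...) call being measured.
	work := make([][]byte, 0, 100)
	for i := 0; i < 100; i++ {
		work = append(work, make([]byte, 1<<20)) // 1 MB each
	}
	_ = work

	runtime.ReadMemStats(&after)
	fmt.Printf("TotalAlloc delta: %d MB\n",
		(after.TotalAlloc-before.TotalAlloc)/(1<<20))
}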
Unrelated - it would be very helpful if you could add the following method; if you're open to it, I'd be happy to create a PR for it:
func (tdi *TextDataIter) Progress() float32 {
	startIndex := tdi.BatchIndex * tdi.BatchSize
	availableIndices := tdi.IndexesLen
	progress := float32(startIndex) / float32(availableIndices)
	return progress
}
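For context, this is how I'd use it inside the training loop above (hypothetical usage, assuming the method lands on ts.TextDataIter as written):

if batchCount%500 == 0 {
	fmt.Printf("Epoch %v - Batch %v (%.1f%% of epoch)\n",
		epoch, batchCount, dataIter.Progress()*100)
}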
@source-transformer,
Could you share in detail how you tested and found that memory usage is still climbing? In the training example we have two nested for-loops, which can cause memory usage to fluctuate; if there were no leak, it would plateau at some point.
I just tested it for 10 epochs on a CUDA device and it seems to be stable (<2 GB of CUDA memory usage).
Feel free to contribute the new API. Thanks.
Closing this issue for now.
Sorry @sugarme - got busy with work.
If you are available to do a screen share at some point, I can walk you through what I'm doing in real time.
In the meantime, I can share the Git project I have that spins up a Linux Docker container; from there, you can run your example.
Hi there, I'm trying Gotch out in a Go server application running on Linux.
I integrated this example application into one of my services: https://github.com/sugarme/gotch/blob/v0.7.0/example/char-rnn/main.go
My service's memory quickly climbs to multiple GBs.
I've done some debugging, and it seems to be related to how tensor arrays are stacked into the tensor pointer. Specifically, this statement:
is doing an allocation that is never freed. Interestingly enough, there is a commented-out statement here:
which, when uncommented, results in a segmentation fault. I think the problem is in this function:
https://github.com/sugarme/gotch/blob/master/ts/tensor.go#L961
I believe this code needs to check whether this is an array allocation and, if so, free each of the individual tensors.
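To make the idea concrete, here is a minimal sketch of the alloc/free pairing I have in mind - the names are hypothetical, not gotch's actual internals:

package main

/*
#include <stdlib.h>
*/
import "C"

import "unsafe"

// stackInputs mimics building a C-side array of tensor handles for a
// stack-style call. The array allocated with C.malloc must be released
// with C.free, and each element (an individually allocated tensor
// handle) must also be freed via the C library's own destructor.
func stackInputs(handles []unsafe.Pointer) {
	n := len(handles)
	ptrSize := C.size_t(unsafe.Sizeof(uintptr(0)))
	arr := C.malloc(C.size_t(n) * ptrSize)
	defer C.free(arr) // the free I believe is currently missing

	// Copy the tensor handles into the C array.
	slice := unsafe.Slice((*unsafe.Pointer)(arr), n)
	for i, h := range handles {
		slice[i] = h
	}
	// ... pass `arr` to the C stack operation here ...
}

func main() {}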
I am testing out a change locally, but wanted to ask whether this is a known issue, or whether there is a workaround - i.e. the possibility that this example application is just using an unsafe memory alloc/dealloc pattern.
Thanks!