#Define vs Const

StillLearning
Posts: 33
Joined: Aug 27, 2019 22:22

#Define vs Const

Postby StillLearning » Aug 10, 2020 19:39

I want an easy way to isolate bits of a byte.
I tried using #define to name all the bits and then using a #define for a mask. This works.
The other way is to use a Const for each bit of the byte. It also works.
Does a #Define use more or less memory than a Const?
Is a #Define faster than a Const?
Which way is better?

Example: using #Define

Code: Select all

'using #define
#Define Bit1  1
#Define Bit2  2
#Define Bit3  4
#Define Bit4  8
#Define Bit5  16
#Define Bit6  32
#Define Bit7  64
#Define Bit8 128
'other bytes currently not being used but can easily be expanded in the future

'create a mask. Others can easily be made.
#Define ComboBits  Bit1+Bit2+Bit3

Dim ColorByte  as UInteger

ColorByte = (ColorByte and ComboBits) 'isolate the first 3 bits.

end


Example: using Const

Code: Select all

'using Const
Const Bit1 =   1
Const Bit2 =   2
Const Bit3 =   4
Const Bit4 =   8
Const Bit5 =  16
Const Bit6 =  32
Const Bit7 =  64
Const Bit8 = 128
'other bytes currently not being used but can easily be expanded in the future

'create a mask. Others can easily be made.
Const ComboBits = Bit1+Bit2+Bit3

Dim ColorByte  as UInteger

ColorByte = (ColorByte and ComboBits) 'isolate the first 3 bits.

end


Is there a better way that easily allows changes without affecting the existing bits in the future?
bcohio2001
Posts: 553
Joined: Mar 10, 2007 15:44
Location: Ohio, USA
Contact:

Re: #Define vs Const

Postby bcohio2001 » Aug 16, 2020 19:17

The "Define" does a text replacement in the code before compilation. While "Const" creates a variable.

It's a matter of coding style of which you prefer.
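
To make the text-replacement point concrete, here is a minimal sketch (the names TNOW and MAXVAL are purely illustrative): a #define can expand to anything, even a runtime expression, while a Const must be a constant expression known at compile time.

Code: Select all

' #define is pure text substitution, so it can expand to a runtime expression
' that is re-evaluated at every place it is used:
#define TNOW Timer

Dim As Double t1 = TNOW   ' becomes: Dim As Double t1 = Timer
Sleep 100
Dim As Double t2 = TNOW   ' a later call to Timer, so a different value
Print t2 - t1

' Const must be a constant expression known at compile time, so this
' would not compile:
' Const START = Timer
Const MAXVAL = 255        ' fine: the value is fixed when the program is built
Print MAXVAL

Sleep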
D.J.Peters
Posts: 8172
Joined: May 28, 2005 3:28
Contact:

Re: #Define vs Const

Postby D.J.Peters » Aug 16, 2020 19:28

With Const you can also define the right datatype!

' a 8 bit byte mask
const as ubyte mask8 = &HFF
' a 16 bit word mask
const as ushort mask16 = &HFFFF
...
' 8 bit masks
const as ubyte bit0 = 1 shl 0
const as ubyte bit1 = 1 shl 1
const as ubyte bit2 = 1 shl 2
const as ubyte bit3 = 1 shl 3
...

Joshy
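
A brief follow-up sketch on that shl pattern (ComboBits and ColorByte are just illustrative names, echoing the opening post): combining the typed bit constants with Or gives a mask, and new bits can be added later without disturbing the existing ones.

Code: Select all

' typed bit constants built with shl, combined into a mask with Or
const as ubyte bit0 = 1 shl 0
const as ubyte bit1 = 1 shl 1
const as ubyte bit2 = 1 shl 2
const as ubyte ComboBits = bit0 or bit1 or bit2   ' = &h07

dim as ubyte ColorByte = &hAD
print hex(ColorByte and ComboBits)   ' keeps only the low three bits -> 5

sleep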
counting_pine
Site Admin
Posts: 6225
Joined: Jul 05, 2005 17:32
Location: Manchester, Lancs

Re: #Define vs Const

Postby counting_pine » Aug 17, 2020 12:00

#define uses simple text replacement, which can have some weird effects because it doesn't respect operator precedence.

Code: Select all

const as ubyte bit1 = 1, bit2 = 2, bit3 = 4

const ComboBits       = bit1+bit2+bit3 ' = 7
#define ComboBitsDefine bit1+bit2+bit3

print ComboBits*3       ' = 21
print ComboBitsDefine*3 ' = 15?

I can't think of any good reason to use #define for constants, but I can think of several bad ones. So I'd recommend Const where possible.
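
One hedged side note on the example above: if a #define must carry an expression, wrapping the body in parentheses sidesteps the precedence surprise, although Const remains the cleaner choice (ComboBitsParen is just an illustrative name).

Code: Select all

const as ubyte bit1 = 1, bit2 = 2, bit3 = 4

' parentheses make the replacement text behave like a single value
#define ComboBitsParen (bit1+bit2+bit3)

print ComboBitsParen*3 ' = 21, the same as the Const version

sleep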
dodicat
Posts: 6687
Joined: Jan 10, 2006 20:30
Location: Scotland

Re: #Define vs Const

Postby dodicat » Aug 17, 2020 13:31

There is something wrong with 64 bit gcc.

Code: Select all

const four = sin(.5)^2+cos(.5)^2 + sin(.5)^2+cos(.5)^2 + sin(.5)^2+cos(.5)^2 + sin(.5)^2+cos(.5)^2
#define _four sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2
dim shared as double four_ = sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2

sub fbmain
    dim as long x

    for n as long = 1 to 100000000
        rnd  'warm up
    next

    for k as long = 1 to 5
        x = 0
        dim as double t = timer
        for n as long = 1 to 100000000
            x += four
        next n
        print timer-t, x, "const"

        x = 0
        t = timer
        for n as long = 1 to 100000000
            x += _four
        next n
        print timer-t, x, "define"

        x = 0
        t = timer
        for n as long = 1 to 100000000
            x += four_
        next n
        print timer-t, x, "variable"

        x = 0
        t = timer
        for n as long = 1 to 100000000
            x += sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2
        next n
        print timer-t, x, "raw"
        print
    next k

    print "Done"
end sub

fbmain

sleep

Whether the looping is inside a sub or in the main program, 64-bit gcc takes too long for the variable four_.
32-bit gas/gcc and gas64 are all OK.
marcov
Posts: 3010
Joined: Jun 16, 2005 9:45
Location: Eindhoven, NL
Contact:

Re: #Define vs Const

Postby marcov » Aug 17, 2020 14:47

Note: a possible difference might also be the point where the type is fixed.

For Const this can happen in the parser at the declaration (the whole Const line), or, for untyped constants, when just the literal is parsed. Either way, one declaration usually means the same type for every use.

A #define, however, depends on the context of usage, which can be both a blessing and a curse.
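
A small sketch of what that context dependence looks like in practice (AREA, rectangle and screen_ are hypothetical names): the replacement text is resolved wherever it is expanded, so it picks up whatever happens to be in scope there, which a Const cannot do.

Code: Select all

' the #define body is re-resolved at every expansion site,
' so it uses whatever w and h mean at that point
#define AREA (w * h)

Sub rectangle()
    Dim As Integer w = 3, h = 4
    Print AREA        ' this sub's Integer w and h -> 12
End Sub

Sub screen_()
    Dim As Double w = 1.5, h = 2.0
    Print AREA        ' same token, different variables and type -> 3
End Sub

rectangle()
screen_()

Sleep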
hhr
Posts: 8
Joined: Nov 29, 2019 10:41

Re: #Define vs Const

Postby hhr » Aug 17, 2020 19:30

@Dodicat
Could it be the conversion between Long and Double?
dodicat
Posts: 6687
Joined: Jan 10, 2006 20:30
Location: Scotland

Re: #Define vs Const

Postby dodicat » Aug 17, 2020 20:09

Hi hhr.
Yes, it is the casting to Long; it is too slow with gcc.
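
If the Long/Double conversion is indeed the cost, one quick way to check (a sketch, not a definitive benchmark) is to accumulate in a Double so no conversion happens inside the loop:

Code: Select all

dim shared as double four_ = sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2+sin(.5)^2+cos(.5)^2

dim as double x = 0
dim as double t = timer
for n as long = 1 to 100000000
    x += four_   ' Double += Double, no cast to Long per iteration
next n
print timer - t, x, "double accumulator"

sleep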
deltarho[1859]
Posts: 2611
Joined: Jan 02, 2017 0:34
Location: UK

Re: #Define vs Const

Postby deltarho[1859] » Aug 20, 2020 7:35

It is worth looking at: Symbolic Constants

I tried posting this a few days ago but the link wasn't working.
