Add temperature and humidity values to biome #8

Open: RatMoleRat wants to merge 11 commits into develop

Conversation

@RatMoleRat commented Jun 28, 2020

This adds temperature and humidity values to each CoreBiome - they're required for Terasology/ClimateConditions#18 to work. It also relies on Terasology/BiomesAPI#5.
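For context, the pattern the PR applies looks roughly like the following. This is a minimal sketch: the parameter order, the default values, and the getters are assumptions based on the review hunk further down, and the real CoreBiome also implements the BiomesAPI Biome interface.

```java
// Hypothetical condensed version of the change; only MOUNTAINS's numbers
// appear in the review hunk below, everything else is illustrative.
public enum CoreBiome {
    OCEAN("Ocean"),
    MOUNTAINS("Mountains", .3f, .09f);

    private final String displayName;
    private final float humidity;
    private final float temperature;

    CoreBiome(String displayName) {
        this(displayName, 0.5f, 0.5f); // assumed defaults for unspecified biomes
    }

    CoreBiome(String displayName, float humidity, float temperature) {
        this.displayName = displayName;
        this.humidity = humidity;
        this.temperature = temperature;
    }

    public float getHumidity() {
        return humidity;
    }

    public float getTemperature() {
        return temperature;
    }
}
```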

@skaldarnar (Contributor) left a comment

This looks good from a technical point of view now, but the concerns I raised in Terasology/BiomesAPI#5 (comment) become apparent here.

There is an implicit conversion going on in BiomeProvider L68-77 that interprets the abstract facet temperature values directly as biomes. The relative ordering of those numbers does not necessarily match the relative ordering of the temperature and humidity values defined here.
On the other hand, maybe that's a good thing, and we should completely forget about the underlying facet temperature noise ... 🤔
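For context, the implicit conversion being referred to is the familiar threshold pattern, sketched below with made-up cut-offs (this is not the actual BiomeProvider code). The point is that the ordering baked into such cut-offs is independent of the explicit per-biome values this PR introduces:

```java
// Illustration only: hypothetical thresholds mapping facet temperature
// noise straight to a biome, independent of each biome's own
// temperature/humidity fields.
CoreBiome biomeFromFacet(float facetTemperature) {
    if (facetTemperature < 0.3f) {
        return CoreBiome.MOUNTAINS; // hypothetical cut-off
    } else if (facetTemperature < 0.6f) {
        return CoreBiome.PLAINS;    // hypothetical cut-off
    }
    return CoreBiome.BEACH;         // hypothetical cut-off
}
```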

OCEAN("Ocean"),
BEACH("Beach"),
PLAINS("Plains");
MOUNTAINS("Mountains", .3f, .09f),
A contributor commented on this hunk:

This looks way better from the technical perspective now 👍

@Cervator (Member) left a comment

Seems reasonable, just a couple trivial comments :-)

I haven't been fully up to speed with all this, but I gather that with this PR we start tracking two new per-block values in just about any setup (since CoreWorlds is so fundamental). Are we at all concerned about memory usage / performance due to that change?
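To put a very rough number on the concern: every figure below is an assumption for illustration (chunk dimensions, a 16-bit extra-data field per value, and the loaded-chunk count all vary in practice):

```java
// Back-of-envelope cost of tracking two extra per-block values.
public class PerBlockCostEstimate {
    public static void main(String[] args) {
        int blocksPerChunk = 32 * 64 * 32; // assumed chunk dimensions
        int fields = 2;                    // temperature + humidity
        int bytesPerField = 2;             // assumed 16-bit storage per field
        int bytesPerChunk = blocksPerChunk * fields * bytesPerField;
        System.out.printf("%d KiB per chunk, ~%.0f MiB across 1000 loaded chunks%n",
                bytesPerChunk / 1024,
                1000L * bytesPerChunk / (1024.0 * 1024.0));
        // With these assumptions: 256 KiB per chunk, ~250 MiB per 1000 chunks.
    }
}
```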

Rect2i processRegion = facet.getWorldRegion();
for (BaseVector2i position : processRegion.contents()) {
    // clamp to reasonable values, just in case
    float noiseAdjusted = TeraMath.clamp(humidityNoise.noise(position.x(), position.y()) + .6f, 0f,1f);
@Cervator commented on this hunk:

What are reasonable values, anyway? I'm not sure what the numbers here mean; it may be good to name them via constants. There's also a space missing after 0f, as a quick trivial style fix :-)
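Something along these lines, for instance; the constant names and the interpretation of each number are guesses, only the values come from the hunk above:

```java
// Hypothetical names for the magic numbers; semantics inferred, not documented.
private static final float HUMIDITY_NOISE_OFFSET = 0.6f; // shifts raw noise toward [0, 1]
private static final float MIN_HUMIDITY = 0f;
private static final float MAX_HUMIDITY = 1f;

float noiseAdjusted = TeraMath.clamp(
        humidityNoise.noise(position.x(), position.y()) + HUMIDITY_NOISE_OFFSET,
        MIN_HUMIDITY, MAX_HUMIDITY);
```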

for (int i = 0; i < noise.length; ++i) {
    noise[i] = TeraMath.clamp((noise[i] * 2.11f + 1f) * 0.5f);
}
for (BaseVector2i position : processRegion.contents()) {
    // modify initial noise
@Cervator commented on this hunk:

Modify it how though?
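Judging purely from the arithmetic in the hunk (this reading is inferred, not confirmed by the PR), a more descriptive comment might be:

```java
// Stretch the raw noise (roughly in [-1, 1]) by 2.11, re-center, and halve,
// so the mid-range of the input maps linearly onto [0, 1]; anything outside
// that band saturates at 0 or 1 via the clamp.
noise[i] = TeraMath.clamp((noise[i] * 2.11f + 1f) * 0.5f);
```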

@Cervator (Member) commented

After some more testing, I have a feeling there's a severe bug related to the extra data being added to CoreWorlds when a secondary client attempts to join a multiplayer game (leading to protobuf errors, network-communication failures, and so on). Log below.

This does not happen with FlowingLiquids in a similar multiplayer setup, despite it also using @ExtraDataSystem and so on. Probably because FlowingLiquids puts the annotation on a system class that's marked with @RegisterSystem(RegisterMode.AUTHORITY) (so it only runs on the server, not on clients), while you've got two of those annotations, one on a provider class and one on a rasterizer. In theory, worldgen should also only happen on the server, but that still feels like an area worth digging into.
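For reference, the FlowingLiquids-style pattern described above would look roughly like this; the class, method, and field names are hypothetical, and the package paths assume the 2020-era engine layout:

```java
import org.terasology.entitySystem.systems.BaseComponentSystem;
import org.terasology.entitySystem.systems.RegisterMode;
import org.terasology.entitySystem.systems.RegisterSystem;
import org.terasology.world.block.Block;
import org.terasology.world.chunks.blockdata.ExtraDataSystem;
import org.terasology.world.chunks.blockdata.RegisterExtraData;

@ExtraDataSystem
@RegisterSystem(RegisterMode.AUTHORITY) // server-only: never runs on clients
public class ClimateExtraDataSystem extends BaseComponentSystem {

    // Registers a per-block extra-data field. With AUTHORITY the system only
    // runs on the server, which the comment above suspects is what spares
    // FlowingLiquids from the client-join failure seen in the log below.
    @RegisterExtraData(name = "CoreWorlds.humidity", bitSize = 8)
    public static boolean hasHumidity(Block block) {
        return true; // hypothetical: every block carries the field
    }
}
```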

...
23:01:26.335 [main] INFO  o.t.e.modes.loadProcesses.JoinServer - Activating module: CombatSystem:1.0.0
23:01:26.335 [main] INFO  o.t.e.modes.loadProcesses.JoinServer - Activating module: NameGenerator:1.0.0
23:01:26.335 [main] INFO  o.t.e.modes.loadProcesses.JoinServer - Activating module: ItemRendering:1.1.0
23:01:37.107 [main] WARN  o.t.engine.internal.TimeBase - Delta too great (1300), capping to 1000
log4j:WARN No appenders could be found for logger (org.apache.http.impl.conn.PoolingHttpClientConnectionManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
23:01:43.641 [main] WARN  o.t.engine.internal.TimeBase - Delta too great (4148), capping to 1000
23:01:50.535 [main] WARN  o.t.engine.internal.TimeBase - Delta too great (3828), capping to 1000
23:03:15.626 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
java.lang.OutOfMemoryError: GC overhead limit exceeded
23:03:24.524 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
java.lang.OutOfMemoryError: GC overhead limit exceeded
23:03:33.301 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
java.lang.OutOfMemoryError: GC overhead limit exceeded
23:03:38.201 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
java.lang.OutOfMemoryError: GC overhead limit exceeded
23:03:41.961 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
java.lang.OutOfMemoryError: GC overhead limit exceeded
23:03:46.942 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
java.lang.OutOfMemoryError: GC overhead limit exceeded
23:03:52.367 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
java.lang.OutOfMemoryError: GC overhead limit exceeded
23:03:58.502 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
java.lang.OutOfMemoryError: GC overhead limit exceeded
Aug 30, 2020 11:04:00 PM org.jboss.netty.channel.socket.nio.AbstractNioSelector
WARNING: Unexpected exception in the selector loop.
java.lang.OutOfMemoryError: GC overhead limit exceeded

Aug 30, 2020 11:04:04 PM org.jboss.netty.channel.socket.nio.AbstractNioSelector
WARNING: Unexpected exception in the selector loop.
java.lang.OutOfMemoryError: GC overhead limit exceeded

23:04:20.474 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
java.lang.OutOfMemoryError: GC overhead limit exceeded
23:04:20.475 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
com.google.protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero).
	at com.google.protobuf.InvalidProtocolBufferException.invalidTag(InvalidProtocolBufferException.java:89)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:158)
	at org.terasology.protobuf.NetData$NetMessage.<init>(NetData.java:491)
	at org.terasology.protobuf.NetData$NetMessage.<init>(NetData.java:455)
	at org.terasology.protobuf.NetData$NetMessage$1.parsePartialFrom(NetData.java:789)
	at org.terasology.protobuf.NetData$NetMessage$1.parsePartialFrom(NetData.java:784)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:137)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:168)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:174)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.jboss.netty.handler.codec.protobuf.ProtobufDecoder.decode(ProtobufDecoder.java:122)
	at org.jboss.netty.handler.codec.oneone.OneToOneDecoder.handleUpstream(OneToOneDecoder.java:66)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.cleanup(FrameDecoder.java:482)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.channelDisconnected(FrameDecoder.java:365)
	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:102)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.handler.codec.oneone.OneToOneDecoder.handleUpstream(OneToOneDecoder.java:60)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.cleanup(FrameDecoder.java:493)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.channelDisconnected(FrameDecoder.java:365)
	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:102)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.channel.SimpleChannelHandler.channelDisconnected(SimpleChannelHandler.java:199)
	at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:120)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
	at org.jboss.netty.channel.Channels.fireChannelDisconnected(Channels.java:396)
	at org.jboss.netty.channel.socket.nio.AbstractNioWorker.close(AbstractNioWorker.java:360)
	at org.jboss.netty.channel.socket.nio.NioClientSocketPipelineSink.eventSunk(NioClientSocketPipelineSink.java:58)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendDownstream(DefaultChannelPipeline.java:779)
	at org.jboss.netty.channel.SimpleChannelHandler.closeRequested(SimpleChannelHandler.java:334)
	at org.jboss.netty.channel.SimpleChannelHandler.handleDownstream(SimpleChannelHandler.java:260)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:591)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendDownstream(DefaultChannelPipeline.java:784)
	at org.jboss.netty.handler.codec.oneone.OneToOneEncoder.handleDownstream(OneToOneEncoder.java:54)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:591)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendDownstream(DefaultChannelPipeline.java:784)
	at org.jboss.netty.handler.codec.oneone.OneToOneEncoder.handleDownstream(OneToOneEncoder.java:54)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:591)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:582)
	at org.jboss.netty.channel.Channels.close(Channels.java:812)
	at org.jboss.netty.channel.AbstractChannel.close(AbstractChannel.java:206)
	at org.terasology.network.internal.ClientHandler.exceptionCaught(ClientHandler.java:65)
	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:112)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.handler.codec.oneone.OneToOneDecoder.handleUpstream(OneToOneDecoder.java:60)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.exceptionCaught(FrameDecoder.java:377)
	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:112)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.handler.codec.oneone.OneToOneDecoder.handleUpstream(OneToOneDecoder.java:60)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.exceptionCaught(FrameDecoder.java:377)
	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:112)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.channel.SimpleChannelHandler.exceptionCaught(SimpleChannelHandler.java:156)
	at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:130)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
	at org.jboss.netty.channel.Channels.fireExceptionCaught(Channels.java:525)
	at org.jboss.netty.channel.AbstractChannelSink.exceptionCaught(AbstractChannelSink.java:48)
	at org.jboss.netty.channel.DefaultChannelPipeline.notifyHandlerException(DefaultChannelPipeline.java:658)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:566)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:310)
	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.terasology.network.internal.MetricRecordingHandler.messageReceived(MetricRecordingHandler.java:45)
	at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
	at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
	at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
	at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
	at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
	at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
	at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
	at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
23:04:23.024 [New I/O worker #3] WARN  o.t.network.internal.ClientHandler - Unexpected exception from client
com.google.protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero).
	at com.google.protobuf.InvalidProtocolBufferException.invalidTag(InvalidProtocolBufferException.java:89)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:158)
...
