Logger

LogLevel

Bases: Enum

Enum for log levels used by the Logger class.

Attributes:

  • DEBUG: Detailed information, typically of interest only when diagnosing problems.
  • INFO: Confirmation that things are working as expected.
  • WARNING: An indication that something unexpected happened.
  • ERROR: A more serious problem; the software has not been able to perform some function.
  • CRITICAL: A serious error, indicating that the program itself may be unable to continue running.

Source code in opensourceleg/logging/logger.py
class LogLevel(Enum):
    """
    Enum for log levels used by the Logger class.

    Attributes:
        DEBUG: Detailed information, typically of interest only when diagnosing problems.
        INFO: Confirmation that things are working as expected.
        WARNING: An indication that something unexpected happened.
        ERROR: A more serious problem, the software has not been able to perform some function.
        CRITICAL: A serious error, indicating that the program itself may be unable to continue running.
    """

    DEBUG = logging.DEBUG
    INFO = logging.INFO
    WARNING = logging.WARNING
    ERROR = logging.ERROR
    CRITICAL = logging.CRITICAL
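Each member wraps the corresponding constant from the standard-library logging module, so a LogLevel value can be passed anywhere an integer level is expected. A minimal standalone sketch (mirroring the enum above so it runs without opensourceleg installed):

```python
import logging
from enum import Enum

# Standalone mirror of the LogLevel enum shown above.
class LogLevel(Enum):
    DEBUG = logging.DEBUG
    INFO = logging.INFO
    WARNING = logging.WARNING
    ERROR = logging.ERROR
    CRITICAL = logging.CRITICAL

# .value is the stdlib integer level, usable with any logging API.
assert LogLevel.DEBUG.value == logging.DEBUG == 10
assert LogLevel.CRITICAL.value == logging.CRITICAL == 50
```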

Logger

Bases: Logger

Represents a custom singleton logger class that extends the built-in Python logger. The logger provides additional functionality for tracking and logging variables to a CSV file. It supports different log levels and log formatting options.
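Because Logger is a singleton, constructing it repeatedly returns the same object, and later constructor calls simply reconfigure that instance. The pattern, in miniature (an illustrative sketch, not the library code):

```python
import threading

class Singleton:
    _instance = None
    _lock = threading.RLock()  # reentrant lock, as in Logger

    def __new__(cls, *args, **kwargs):
        # Checked under the lock so concurrent first calls
        # cannot create two instances.
        with cls._lock:
            if cls._instance is None:
                cls._instance = super().__new__(cls)
        return cls._instance

a = Singleton()
b = Singleton()
assert a is b  # every "construction" yields the same object
```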

Parameters:

  • log_path (str): The path to save log files. Default: './'
  • log_format (str): The log message format. Default: '[%(asctime)s] %(levelname)s: %(message)s'
  • file_level (LogLevel): The log level for file output. Default: DEBUG
  • stream_level (LogLevel): The log level for console output. Default: INFO
  • file_max_bytes (int): The maximum size of the log file in bytes before rotation. Default: 0
  • file_backup_count (int): The number of backup log files to keep. Default: 5
  • file_name (Union[str, None]): The base name for the log file. Default: None
  • buffer_size (int): The maximum number of log entries to buffer before writing to the CSV file. Default: 1000
  • enable_csv_logging (bool): Whether to enable CSV logging. Default: True
Properties:
  • file_path: The path to the log file.
  • buffer_size: The maximum number of log entries to buffer.
  • file_level: The log level for file output.
  • stream_level: The log level for console output.
  • file_max_bytes: The maximum size of the log file in bytes before rotation.
  • file_backup_count: The number of backup log files to keep.
  • csv_logging_enabled: Whether CSV logging is enabled.
  • tracked_variable_count: The number of currently tracked variables.

Methods:

  • track_variable: Track a variable for logging.
  • flush_buffer: Write the buffered log entries to the CSV file.
  • reset: Reset the logger state.
  • close: Close the logger and flush any remaining log entries.
  • debug: Log a debug message.
  • info: Log an info message.
  • warning: Log a warning message.
  • error: Log an error message.
  • critical: Log a critical message.
  • log: Log a message at a specific log level.

Examples:

>>> logger = Logger()
>>> logger.info("This is an info message")
[2022-01-01 12:00:00] INFO: This is an info message
>>> logger.debug("This is a debug message")
[2022-01-01 12:00:00] DEBUG: This is a debug message
>>> logger.track_variable(lambda: 42, "answer")
>>> logger.update()
>>> logger.flush_buffer()
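The track_variable/update/flush_buffer cycle follows a buffered-CSV pattern: each update samples every tracked callable into one row, and flush_buffer writes the header once followed by all buffered rows. The pattern can be sketched standalone (names mirror the API, but this toy class is an illustration, not the library code):

```python
import csv
import io
from collections import deque

class VariableBuffer:
    """Toy version of the track_variable/update/flush_buffer cycle."""

    def __init__(self, buffer_size=1000):
        self._vars = {}                       # name -> zero-arg callable
        self._buffer = deque(maxlen=buffer_size)
        self._header_written = False

    def track_variable(self, func, name):
        self._vars[name] = func

    def update(self):
        # Sample every tracked variable into one CSV row.
        self._buffer.append([str(f()) for f in self._vars.values()])

    def flush_buffer(self, file):
        writer = csv.writer(file)
        if not self._header_written:
            writer.writerow(self._vars.keys())  # header written only once
            self._header_written = True
        writer.writerows(self._buffer)
        self._buffer.clear()

buf = VariableBuffer()
buf.track_variable(lambda: 42, "answer")
buf.update()
out = io.StringIO()
buf.flush_buffer(out)
# out now holds a header row ("answer") followed by one data row ("42")
```

In the real Logger, flushing also happens automatically once the buffer reaches buffer_size, and on close().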
Source code in opensourceleg/logging/logger.py
class Logger(logging.Logger):
    """
    Represents a custom singleton logger class that extends the built-in Python logger. The logger provides additional
    functionality for tracking and logging variables to a CSV file. It supports different log levels and log formatting
    options.

    Args:
        log_path (str): The path to save log files.
        log_format (str): The log message format.
        file_level (LogLevel): The log level for file output.
        stream_level (LogLevel): The log level for console output.
        file_max_bytes (int): The maximum size of the log file in bytes before rotation.
        file_backup_count (int): The number of backup log files to keep.
        file_name (Union[str, None]): The base name for the log file.
        buffer_size (int): The maximum number of log entries to buffer before writing to the CSV file.
        enable_csv_logging (bool): Whether to enable CSV logging.

    Properties:
        - **file_path**: The path to the log file.
        - **buffer_size**: The maximum number of log entries to buffer.
        - **file_level**: The log level for file output.
        - **stream_level**: The log level for console output.
        - **file_max_bytes**: The maximum size of the log file in bytes before rotation.
        - **file_backup_count**: The number of backup log files to keep.
        - **csv_logging_enabled**: Whether CSV logging is enabled.
        - **tracked_variable_count**: The number of currently tracked variables.

    Methods:
        - **track_variable**: Track a variable for logging.
        - **flush_buffer**: Write the buffered log entries to the CSV file.
        - **reset**: Reset the logger state.
        - **close**: Close the logger and flush any remaining log entries.
        - **debug**: Log a debug message.
        - **info**: Log an info message.
        - **warning**: Log a warning message.
        - **error**: Log an error message.
        - **critical**: Log a critical message.
        - **log**: Log a message at a specific log level.

    Examples:
        >>> logger = Logger()
        >>> logger.info("This is an info message")
        [2022-01-01 12:00:00] INFO: This is an info message
        >>> logger.debug("This is a debug message")
        [2022-01-01 12:00:00] DEBUG: This is a debug message

        >>> logger.track_variable(lambda: 42, "answer")
        >>> logger.update()
        >>> logger.flush_buffer()
    """

    _instance = None
    _lock = threading.RLock()  # Reentrant lock for thread safety

    def __new__(cls, *args: Any, **kwargs: Any) -> "Logger":
        """
        Ensure that only one instance of Logger is created (singleton pattern).

        Returns:
            Logger: The singleton Logger instance.
        """
        with cls._lock:
            if cls._instance is None:
                cls._instance = super().__new__(cls)
            else:
                logging.debug(f"Reusing existing Logger instance: {id(cls._instance)}")
        return cls._instance

    def __init__(
        self,
        log_path: str = "./",
        log_format: str = "[%(asctime)s] %(levelname)s: %(message)s",
        file_level: LogLevel = LogLevel.DEBUG,
        stream_level: LogLevel = LogLevel.INFO,
        file_max_bytes: int = 0,
        file_backup_count: int = 5,
        file_name: Union[str, None] = None,
        buffer_size: int = 1000,
        enable_csv_logging: bool = True,
    ) -> None:
        """
        Initialize the Logger instance.

        Sets up logging paths, format, handler levels, and internal buffers for tracking variables.

        Args:
            log_path (str): Directory path where log files will be stored.
            log_format (str): Format string for log messages.
            file_level (LogLevel): Logging level for file handler.
            stream_level (LogLevel): Logging level for stream (console) handler.
            file_max_bytes (int): Maximum size (in bytes) for log file rotation.
            file_backup_count (int): Number of backup log files to keep.
            file_name (Union[str, None]): Optional user-specified file name prefix.
            buffer_size (int): Maximum number of log records to buffer before writing to CSV.
            enable_csv_logging (bool): Whether to enable CSV logging.
        """
        with self._lock:
            if not hasattr(self, "_initialized"):
                super().__init__(__name__)
                self._log_path = log_path
                self._log_format = log_format
                self._file_level = file_level
                self._stream_level = stream_level
                self._file_max_bytes = file_max_bytes
                self._file_backup_count = file_backup_count
                self._user_file_name = file_name
                self._enable_csv_logging = enable_csv_logging

                self._file_path: str = ""
                self._csv_path: str = ""
                self._file: Optional[Any] = None
                self._writer = None
                self._is_logging = False
                self._header_written = False

                self._tracked_vars: dict[int, Callable[[], Any]] = {}
                self._var_names: dict[int, str] = {}
                self._buffer: deque = deque(maxlen=buffer_size)
                self._buffer_size: int = buffer_size
                self._error_count: dict[int, int] = {}  # Track errors per variable
                self._max_errors_before_untrack: int = 5  # Auto-untrack after this many errors

                try:
                    self._setup_logging()
                    self._initialized: bool = True
                except Exception as e:
                    print(f"Error initializing logger: {e}")
                    raise
            else:
                self.set_file_name(file_name)
                self.set_file_level(file_level)
                self.set_stream_level(stream_level)
                self.set_format(log_format)
                self._file_max_bytes = file_max_bytes
                self._file_backup_count = file_backup_count
                self.set_buffer_size(buffer_size)
                self._enable_csv_logging = enable_csv_logging
                self._log_path = log_path

    def _setup_logging(self) -> None:
        """
        Set up the stream logging handler.

        Configures the logger level, formatter, and attaches a stream handler for console output.
        """
        with self._lock:
            if not hasattr(self, "_stream_handler"):  # Prevent duplicate handlers
                self.setLevel(level=self._file_level.value)
                self._std_formatter = logging.Formatter(self._log_format)
                self._stream_handler = logging.StreamHandler()
                self._stream_handler.setLevel(level=self._stream_level.value)
                self._stream_handler.setFormatter(fmt=self._std_formatter)
                self.addHandler(hdlr=self._stream_handler)

    def set_stream_terminator(self, terminator: str) -> None:
        """
        Set the terminator for the stream handler.
        """
        with self._lock:
            self._stream_handler.terminator = terminator

    def _setup_file_handler(self) -> None:
        """
        Set up the file logging handler.
        """
        with self._lock:
            if not hasattr(self, "_file_handler"):  # Ensure file handler is added only once
                try:
                    self._generate_file_paths()

                    self._file_handler = RotatingFileHandler(
                        filename=self._file_path,
                        mode="w",
                        maxBytes=self._file_max_bytes,
                        backupCount=self._file_backup_count,
                        encoding="utf-8",
                    )
                    self._file_handler.setLevel(level=self._file_level.value)
                    self._file_handler.setFormatter(fmt=self._std_formatter)
                    self.addHandler(hdlr=self._file_handler)
                except Exception as e:
                    self.error(f"Failed to set up file handler: {e}")
                    # Fall back to console-only logging
                    self.warning("Falling back to console-only logging")

    def _ensure_file_handler(self) -> None:
        """
        Ensure that the file handler is set up.
        """
        with self._lock:
            if not hasattr(self, "_file_handler"):
                self._setup_file_handler()

    def track_variable(self, var_func: Callable[[], Any], name: str) -> None:
        """
        Record the value of a variable and log it to a CSV file.

        Args:
            var_func: A function that returns the value of the variable.
            name: The name of the variable.

        Examples:
            >>> class MyClass:
            ...     def __init__(self):
            ...         self.value = 42
            >>> obj = MyClass()
            >>> LOGGER.track_variable(lambda: obj.value, "answer")
            >>> LOGGER.update()
            >>> LOGGER.flush_buffer()
        """
        with self._lock:
            var_id = id(var_func)
            self._tracked_vars[var_id] = var_func
            self._var_names[var_id] = name
            self._error_count[var_id] = 0  # Initialize error count
            self.debug(f"Started tracking variable: {name}")

    def get_tracked_variables(self) -> list[tuple[str, Any]]:
        """
        Get a list of currently tracked variables and their current values.

        Returns:
            List[Tuple[str, Any]]: A list of tuples containing variable names and their current values.
        """
        with self._lock:
            result = []
            for var_id, get_value in self._tracked_vars.items():
                name = self._var_names.get(var_id, "unknown")
                try:
                    value = get_value()
                    result.append((name, value))
                except Exception as e:
                    result.append((name, f"ERROR: {e}"))
            return result

    def __repr__(self) -> str:
        """
        Return a string representation of the Logger instance.

        Returns:
            str: A string representation including the current file path and tracked variable count.
        """
        return f"Logger(file_path={self._file_path}, tracked_vars={len(self._tracked_vars)})"

    def set_file_name(self, file_name: Union[str, None]) -> None:
        """
        Set the base name for the log file.

        Args:
            file_name: The base name for the log file.

        Examples:
            >>> LOGGER.set_file_name("my_log_file")
            >>> LOGGER.file_path
            "./my_log_file.log"
        """
        with self._lock:
            try:
                # Ensure log directory exists
                os.makedirs(self._log_path, exist_ok=True)

                # Handle None file_name case
                if file_name is None:
                    # Generate default name if none provided
                    now = datetime.now()
                    timestamp = now.strftime("%Y%m%d_%H%M%S")
                    script_name = os.path.basename(__file__).split(".")[0]
                    file_name = f"{script_name}_{timestamp}"
                elif "." in file_name:
                    # If filename has an extension, remove it
                    file_name = file_name.split(".")[0]

                self._user_file_name = file_name
                self._file_path = os.path.join(self._log_path, f"{file_name}.log")
                self._csv_path = os.path.join(self._log_path, f"{file_name}.csv")

                # If we already have a file handler, we need to recreate it
                if hasattr(self, "_file_handler"):
                    self.removeHandler(self._file_handler)
                    self._file_handler.close()
                    del self._file_handler
                    self._setup_file_handler()

                # Reset CSV file if it exists
                if self._file:
                    self.close()
            except Exception as e:
                self.error(f"Error setting file name: {e}")
                raise

    def set_file_level(self, level: LogLevel) -> None:
        """
        Set the log level for file output.

        Args:
            level: The log level for file output.

        Examples:
            >>> LOGGER.set_file_level(LogLevel.INFO)
            >>> LOGGER.file_level
            LogLevel.INFO
            >>> LOGGER.debug("This is a debug message and will not be logged")
        """
        with self._lock:
            self._file_level = level
            if hasattr(self, "_file_handler"):
                self._file_handler.setLevel(level=level.value)

    def set_stream_level(self, level: LogLevel) -> None:
        """
        Set the log level for console output.

        Args:
            level: The log level for console output.

        Examples:
            >>> LOGGER.set_stream_level(LogLevel.INFO)
            >>> LOGGER.stream_level
            LogLevel.INFO
            >>> LOGGER.debug("This is a debug message and will not be streamed")
        """
        with self._lock:
            self._stream_level = level
            self._stream_handler.setLevel(level=level.value)

    def set_format(self, log_format: str) -> None:
        """
        Set the log message format. The format string uses the same syntax as the built-in Python logging module.

        Args:
            log_format: The log message format.

        Examples:
            >>> LOGGER.set_format("[%(asctime)s] %(levelname)s: %(message)s")
            >>> LOGGER.info("This is an info message")
            [2022-01-01 12:00:00] INFO: This is an info message
        """
        with self._lock:
            self._log_format = log_format
            self._std_formatter = logging.Formatter(log_format)
            if hasattr(self, "_file_handler"):
                self._file_handler.setFormatter(fmt=self._std_formatter)
            self._stream_handler.setFormatter(fmt=self._std_formatter)

    def set_buffer_size(self, buffer_size: int) -> None:
        """
        Set the maximum number of log entries to buffer before writing to the CSV file.

        Args:
            buffer_size: The maximum number of log entries to buffer.
        """
        with self._lock:
            if buffer_size <= 0:
                self.warning(f"Invalid buffer size: {buffer_size}. Using default of 1000.")
                buffer_size = 1000
            self._buffer_size = buffer_size
            # Create a new buffer with the updated size and copy over existing items
            old_buffer = list(self._buffer)
            self._buffer = deque(maxlen=buffer_size)
            for item in old_buffer:
                self._buffer.append(item)

    def set_csv_logging(self, enable: bool) -> None:
        """
        Enable or disable CSV logging.

        Args:
            enable (bool): Whether to enable CSV logging.
        """
        with self._lock:
            if self._enable_csv_logging != enable:
                self._enable_csv_logging = enable
                if not enable:
                    self.flush_buffer()
                    if self._file:
                        self._file.close()
                        self._file = None
                        self._writer = None
                self.debug(f"CSV logging {'enabled' if enable else 'disabled'}")

    def set_max_errors_before_untrack(self, max_errors: int) -> None:
        """
        Set the maximum number of errors before a variable is automatically untracked.

        Args:
            max_errors (int): Maximum number of errors before untracking.
        """
        with self._lock:
            if max_errors < 0:
                self.warning(f"Invalid max_errors value: {max_errors}. Using default of 5.")
                max_errors = 5
            self._max_errors_before_untrack = max_errors

    def update(self) -> None:
        """
        Update the logger by logging the current values of tracked variables to the buffer.

        Examples:
            >>> class MyClass:
            ...     def __init__(self):
            ...         self.value = 42
            >>> obj = MyClass()
            >>> LOGGER.track_variable(lambda: obj.value, "answer")
            >>> LOGGER.update()
        """
        if not self._tracked_vars or not self._enable_csv_logging:
            return

        with self._lock:
            data = []
            vars_to_untrack = []

            for var_id, get_value in self._tracked_vars.items():
                try:
                    value = get_value()
                    data.append(str(value))
                    # Reset error count on successful retrieval
                    self._error_count[var_id] = 0
                except Exception as e:
                    var_name = self._var_names.get(var_id, "unknown")
                    self.warning(f"Error getting value for {var_name}: {e}")
                    data.append("ERROR")

                    # Increment error count and check if we should untrack
                    self._error_count[var_id] = self._error_count.get(var_id, 0) + 1
                    if self._error_count[var_id] >= self._max_errors_before_untrack:
                        vars_to_untrack.append((var_id, var_name))

            # Only add data if we have variables to track
            if data:
                self._buffer.append(data)

            # Untrack variables with too many errors
            for var_id, var_name in vars_to_untrack:
                self._tracked_vars.pop(var_id, None)
                self._var_names.pop(var_id, None)
                self._error_count.pop(var_id, None)
                self.warning(
                    f"Auto-untracked variable {var_name} after {self._max_errors_before_untrack} consecutive errors"
                )

            if len(self._buffer) >= self._buffer_size:
                self.flush_buffer()

    def flush_buffer(self) -> None:
        """
        Flush the buffered log data to the CSV file.

        Ensures that the file handler is available, writes the header if not yet written,
        writes all buffered rows to the CSV, clears the buffer, and flushes the file.
        """
        if not self._buffer or not self._enable_csv_logging:
            return

        with self._lock:
            try:
                self._ensure_file_handler()

                if self._file is None:
                    try:
                        self._file = open(self._csv_path, "w", newline="")
                        self._writer = csv.writer(self._file)  # type: ignore[assignment]
                    except Exception as e:
                        self.error(f"Failed to open CSV file {self._csv_path}: {e}")
                        # Clear buffer to prevent memory buildup
                        self._buffer.clear()
                        return

                if not self._header_written:
                    self._write_header()

                try:
                    self._writer.writerows(self._buffer)  # type: ignore[attr-defined]
                    self._buffer.clear()
                    self._file.flush()
                except Exception as e:
                    self.error(f"Failed to write to CSV file: {e}")
                    # Try to recover by reopening the file
                    if self._file:
                        with contextlib.suppress(Exception):
                            self._file.close()
                    self._file = None
                    self._writer = None
                    self._header_written = False
            except Exception as e:
                self.error(f"Unexpected error in flush_buffer: {e}")

    def _write_header(self) -> None:
        """
        Write the CSV header based on tracked variable names.
        This header is written only once per log file.
        """
        try:
            header = list(self._var_names.values())
            if header:  # Only write header if we have variables
                self._writer.writerow(header)  # type: ignore[attr-defined]
                self._header_written = True
        except Exception as e:
            self.error(f"Failed to write CSV header: {e}")

    def _generate_file_paths(self) -> None:
        """
        Generate file paths for the log and CSV files based on the current settings.

        Creates the log directory if it does not exist, and uses the current timestamp
        (and optionally a user-specified name) to generate unique file names.
        """
        try:
            # Ensure log directory exists
            os.makedirs(self._log_path, exist_ok=True)

            now = datetime.now()
            timestamp = now.strftime("%Y%m%d_%H%M%S")
            script_name = os.path.basename(__file__).split(".")[0]

            base_name = self._user_file_name if self._user_file_name else f"{script_name}_{timestamp}"

            file_path = os.path.join(self._log_path, base_name)
            self._file_path = file_path + ".log"
            self._csv_path = file_path + ".csv"
        except Exception as e:
            print(f"Error generating file paths: {e}")  # Use print as logger might not be ready
            raise

    def __del__(self) -> None:
        """
        Destructor for the Logger class.
        """
        self.close()

    def __enter__(self) -> "Logger":
        """
        Enter the runtime context related to this Logger instance.

        Returns:
            Logger: The current Logger instance.
        """
        return self

    def __exit__(self, exc_type: Any, exc_val: Any, exc_tb: Any) -> None:
        """
        Exit the runtime context and close the Logger.

        Args:
            exc_type (Any): The exception type if an exception occurred.
            exc_val (Any): The exception value if an exception occurred.
            exc_tb (Any): The traceback if an exception occurred.
        """
        self.close()

    def reset(self) -> None:
        """
        Reset the Logger.

        Closes the current file, reinitializes the logging handlers, clears tracked variables,
        and resets header status.
        """
        with self._lock:
            try:
                self.close()

                # Remove and clean up handlers
                if hasattr(self, "_file_handler"):
                    self.removeHandler(self._file_handler)
                    self._file_handler.close()
                    del self._file_handler

                if hasattr(self, "_stream_handler"):
                    self.removeHandler(self._stream_handler)
                    self._stream_handler.close()  # Close the stream handler
                    del self._stream_handler  # Delete the attribute

                # Reinitialize logging
                self._setup_logging()

                # Reset tracking and state variables
                self._tracked_vars.clear()
                self._var_names.clear()
                self._error_count.clear()
                self._header_written = False
                self._file = None
                self._writer = None

                self.debug("Logger reset successfully")
            except Exception as e:
                print(f"Error resetting logger: {e}")  # Use print as logger might be in bad state

    def close(self) -> None:
        """
        Flush any buffered log data and close the CSV file.

        This method should be called before the program exits to ensure all data is written.
        Close the logger and flush any remaining log entries.

        Examples:
            >>> LOGGER.close()
            >>> LOGGER.info("This message will not be logged")
        """
        with self._lock:
            try:
                self.flush_buffer()
                if self._file:
                    self._file.close()
                    self._file = None
                    self._writer = None
            except Exception as e:
                self.error(f"Error closing logger: {e}")

    def debug(self, msg: object, *args: object, **kwargs: Any) -> None:
        """
        Log a debug message.

        Ensures that the file handler is set up before logging.

        Args:
            msg (object): The message to log.
            *args (object): Additional arguments.
            **kwargs (Any): Additional keyword arguments.
        """
        self._ensure_file_handler()
        super().debug(msg, *args, **kwargs)

    def info(self, msg: object, *args: object, **kwargs: Any) -> None:
        """
        Log an info message.

        Ensures that the file handler is set up before logging.

        Args:
            msg (object): The message to log.
            *args (object): Additional arguments.
            **kwargs (Any): Additional keyword arguments.
        """
        self._ensure_file_handler()
        super().info(msg, *args, **kwargs)

    def warning(self, msg: object, *args: object, **kwargs: Any) -> None:
        """
        Log a warning message.

        Ensures that the file handler is set up before logging.

        Args:
            msg (object): The message to log.
            *args (object): Additional arguments.
            **kwargs (Any): Additional keyword arguments.
        """
        self._ensure_file_handler()
        super().warning(msg, *args, **kwargs)

    def error(self, msg: object, *args: object, **kwargs: Any) -> None:
        """
        Log an error message.

        Ensures that the file handler is set up before logging.

        Args:
            msg (object): The message to log.
            *args (object): Additional arguments.
            **kwargs (Any): Additional keyword arguments.
        """
        self._ensure_file_handler()
        super().error(msg, *args, **kwargs)

    def critical(self, msg: object, *args: object, **kwargs: Any) -> None:
        """
        Log a critical message.

        Ensures that the file handler is set up before logging.

        Args:
            msg (object): The message to log.
            *args (object): Additional arguments.
            **kwargs (Any): Additional keyword arguments.
        """
        self._ensure_file_handler()
        super().critical(msg, *args, **kwargs)

    def log(self, level: int, msg: object, *args: object, **kwargs: Any) -> None:
        """
        Log a message with a specific log level.

        Ensures that the file handler is set up before logging.

        Args:
            level (int): The log level.
            msg (object): The message to log.
            *args (object): Additional arguments.
            **kwargs (Any): Additional keyword arguments.
        """
        self._ensure_file_handler()
        super().log(level, msg, *args, **kwargs)

    @property
    def log_format(self) -> str:
        """
        Get the current log format.
        """
        return self._log_format

    @property
    def file_name(self) -> Optional[str]:
        """
        Get the current file name.
        """
        return self._user_file_name

    @property
    def file_path(self) -> Optional[str]:
        """
        Get the current file path for the log file.

        Returns:
            Optional[str]: The file path as a string, or None if not set.
        """
        return self._file_path

    @property
    def csv_path(self) -> Optional[str]:
        """
        Get the current file path for the CSV file.

        Returns:
            Optional[str]: The CSV file path as a string, or None if not set.
        """
        return self._csv_path

    @property
    def log_path(self) -> str:
        """
        Get the log directory path.

        Returns:
            str: The directory path where log files are stored.
        """
        return self._log_path

    @property
    def buffer_size(self) -> int:
        """
        Get the current buffer size.

        Returns:
            int: The maximum number of log records held in the buffer.
        """
        return self._buffer_size

    @property
    def file_level(self) -> LogLevel:
        """
        Get the current file logging level.

        Returns:
            LogLevel: The logging level for the file handler.
        """
        return self._file_level

    @property
    def stream_level(self) -> LogLevel:
        """
        Get the current stream (console) logging level.

        Returns:
            LogLevel: The logging level for the stream handler.
        """
        return self._stream_level

    @property
    def file_max_bytes(self) -> int:
        """
        Get the maximum number of bytes for the log file before rotation.

        Returns:
            int: The maximum file size in bytes.
        """
        return self._file_max_bytes

    @property
    def file_backup_count(self) -> int:
        """
        Get the number of backup log files to keep.

        Returns:
            int: The backup count.
        """
        return self._file_backup_count

    @property
    def csv_logging_enabled(self) -> bool:
        """
        Get whether CSV logging is enabled.

        Returns:
            bool: Whether CSV logging is enabled.
        """
        return self._enable_csv_logging

    @property
    def tracked_variable_count(self) -> int:
        """
        Get the number of currently tracked variables.

        Returns:
            int: The number of tracked variables.
        """
        return len(self._tracked_vars)

buffer_size: int property

Get the current buffer size.

Returns:
    int: The maximum number of log records held in the buffer.

csv_logging_enabled: bool property

Get whether CSV logging is enabled.

Returns:
    bool: Whether CSV logging is enabled.

csv_path: Optional[str] property

Get the current file path for the CSV file.

Returns:
    Optional[str]: The CSV file path as a string, or None if not set.

file_backup_count: int property

Get the number of backup log files to keep.

Returns:
    int: The backup count.

file_level: LogLevel property

Get the current file logging level.

Returns:
    LogLevel: The logging level for the file handler.

file_max_bytes: int property

Get the maximum number of bytes for the log file before rotation.

Returns:
    int: The maximum file size in bytes.

file_name: Optional[str] property

Get the current file name.

file_path: Optional[str] property

Get the current file path for the log file.

Returns:
    Optional[str]: The file path as a string, or None if not set.

log_format: str property

Get the current log format.

log_path: str property

Get the log directory path.

Returns:
    str: The directory path where log files are stored.

stream_level: LogLevel property

Get the current stream (console) logging level.

Returns:
    LogLevel: The logging level for the stream handler.

tracked_variable_count: int property

Get the number of currently tracked variables.

Returns:
    int: The number of tracked variables.

__del__()

Destructor for the Logger class.

Source code in opensourceleg/logging/logger.py
def __del__(self) -> None:
    """
    Destructor for the Logger class.
    """
    self.close()

__enter__()

Enter the runtime context related to this Logger instance.

Returns:
    Logger: The current Logger instance.

Source code in opensourceleg/logging/logger.py
def __enter__(self) -> "Logger":
    """
    Enter the runtime context related to this Logger instance.

    Returns:
        Logger: The current Logger instance.
    """
    return self

__exit__(exc_type, exc_val, exc_tb)

Exit the runtime context and close the Logger.

Parameters:
    exc_type (Any): The exception type if an exception occurred.
    exc_val (Any): The exception value if an exception occurred.
    exc_tb (Any): The traceback if an exception occurred.
Source code in opensourceleg/logging/logger.py
def __exit__(self, exc_type: Any, exc_val: Any, exc_tb: Any) -> None:
    """
    Exit the runtime context and close the Logger.

    Args:
        exc_type (Any): The exception type if an exception occurred.
        exc_val (Any): The exception value if an exception occurred.
        exc_tb (Any): The traceback if an exception occurred.
    """
    self.close()
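
The context-manager protocol above guarantees that buffered rows reach disk even when the body of the `with` block raises. A minimal stdlib-only sketch of the same pattern (the `BufferedCsvLog` name and its methods are illustrative, not the opensourceleg API):

```python
import csv
from collections import deque


class BufferedCsvLog:
    """Illustrative: buffers rows in memory and flushes them on context exit."""

    def __init__(self, path, maxlen=1000):
        self._path = path
        self._buffer = deque(maxlen=maxlen)

    def append(self, row):
        self._buffer.append(row)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Flush whatever is buffered, even if the with-body raised.
        with open(self._path, "w", newline="") as f:
            csv.writer(f).writerows(self._buffer)
        self._buffer.clear()


with BufferedCsvLog("data.csv") as log:
    log.append(["t", "value"])
    log.append([0.0, 42])
```

Because `__exit__` runs unconditionally, this is the same guarantee you get from wrapping a `Logger` in a `with` statement instead of remembering to call `close()` on every exit path.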

__init__(log_path='./', log_format='[%(asctime)s] %(levelname)s: %(message)s', file_level=LogLevel.DEBUG, stream_level=LogLevel.INFO, file_max_bytes=0, file_backup_count=5, file_name=None, buffer_size=1000, enable_csv_logging=True)

Initialize the Logger instance.

Sets up logging paths, format, handler levels, and internal buffers for tracking variables.

Parameters:
    log_path (str): Directory path where log files will be stored. Default: './'
    log_format (str): Format string for log messages. Default: '[%(asctime)s] %(levelname)s: %(message)s'
    file_level (LogLevel): Logging level for file handler. Default: DEBUG
    stream_level (LogLevel): Logging level for stream (console) handler. Default: INFO
    file_max_bytes (int): Maximum size (in bytes) for log file rotation. Default: 0
    file_backup_count (int): Number of backup log files to keep. Default: 5
    file_name (Union[str, None]): Optional user-specified file name prefix. Default: None
    buffer_size (int): Maximum number of log records to buffer before writing to CSV. Default: 1000
    enable_csv_logging (bool): Whether to enable CSV logging. Default: True
Source code in opensourceleg/logging/logger.py
def __init__(
    self,
    log_path: str = "./",
    log_format: str = "[%(asctime)s] %(levelname)s: %(message)s",
    file_level: LogLevel = LogLevel.DEBUG,
    stream_level: LogLevel = LogLevel.INFO,
    file_max_bytes: int = 0,
    file_backup_count: int = 5,
    file_name: Union[str, None] = None,
    buffer_size: int = 1000,
    enable_csv_logging: bool = True,
) -> None:
    """
    Initialize the Logger instance.

    Sets up logging paths, format, handler levels, and internal buffers for tracking variables.

    Args:
        log_path (str): Directory path where log files will be stored.
        log_format (str): Format string for log messages.
        file_level (LogLevel): Logging level for file handler.
        stream_level (LogLevel): Logging level for stream (console) handler.
        file_max_bytes (int): Maximum size (in bytes) for log file rotation.
        file_backup_count (int): Number of backup log files to keep.
        file_name (Union[str, None]): Optional user-specified file name prefix.
        buffer_size (int): Maximum number of log records to buffer before writing to CSV.
        enable_csv_logging (bool): Whether to enable CSV logging.
    """
    with self._lock:
        if not hasattr(self, "_initialized"):
            super().__init__(__name__)
            self._log_path = log_path
            self._log_format = log_format
            self._file_level = file_level
            self._stream_level = stream_level
            self._file_max_bytes = file_max_bytes
            self._file_backup_count = file_backup_count
            self._user_file_name = file_name
            self._enable_csv_logging = enable_csv_logging

            self._file_path: str = ""
            self._csv_path: str = ""
            self._file: Optional[Any] = None
            self._writer = None
            self._is_logging = False
            self._header_written = False

            self._tracked_vars: dict[int, Callable[[], Any]] = {}
            self._var_names: dict[int, str] = {}
            self._buffer: deque = deque(maxlen=buffer_size)
            self._buffer_size: int = buffer_size
            self._error_count: dict[int, int] = {}  # Track errors per variable
            self._max_errors_before_untrack: int = 5  # Auto-untrack after this many errors

            try:
                self._setup_logging()
                self._initialized: bool = True
            except Exception as e:
                print(f"Error initializing logger: {e}")
                raise
        else:
            self.set_file_name(file_name)
            self.set_file_level(file_level)
            self.set_stream_level(stream_level)
            self.set_format(log_format)
            self._file_max_bytes = file_max_bytes
            self._file_backup_count = file_backup_count
            self.set_buffer_size(buffer_size)
            self._enable_csv_logging = enable_csv_logging
            self._log_path = log_path

__new__(*args, **kwargs)

Ensure that only one instance of Logger is created (singleton pattern).

Returns:
    Logger: The singleton Logger instance.

Source code in opensourceleg/logging/logger.py
def __new__(cls, *args: Any, **kwargs: Any) -> "Logger":
    """
    Ensure that only one instance of Logger is created (singleton pattern).

    Returns:
        Logger: The singleton Logger instance.
    """
    with cls._lock:
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        else:
            logging.debug(f"Reusing existing Logger instance: {id(cls._instance)}")
    return cls._instance
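
`__new__` implements a thread-safe singleton: every construction returns the same instance, and repeat calls merely reconfigure it. The pattern in isolation looks like this (stdlib only; the `Singleton` class name is illustrative):

```python
import threading


class Singleton:
    _instance = None
    _lock = threading.Lock()

    def __new__(cls, *args, **kwargs):
        # Creation happens under a lock so that concurrent first calls
        # cannot race and construct two separate instances.
        with cls._lock:
            if cls._instance is None:
                cls._instance = super().__new__(cls)
        return cls._instance


a = Singleton()
b = Singleton()
assert a is b  # every call returns the one shared instance
```

This is why the `_initialized` guard in `__init__` matters: `__init__` still runs on every `Logger(...)` call even though `__new__` keeps handing back the same object.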

__repr__()

Return a string representation of the Logger instance.

Returns:
    str: A string representation including the current file path and tracked variable count.

Source code in opensourceleg/logging/logger.py
def __repr__(self) -> str:
    """
    Return a string representation of the Logger instance.

    Returns:
        str: A string representation including the current file path and tracked variable count.
    """
    return f"Logger(file_path={self._file_path}, tracked_vars={len(self._tracked_vars)})"

close()

Flush any buffered log data and close the CSV file.

This method should be called before the program exits to ensure all buffered data is written.

Examples:

>>> LOGGER.close()
>>> LOGGER.info("This message will not be logged")
Source code in opensourceleg/logging/logger.py
def close(self) -> None:
    """
    Flush any buffered log data and close the CSV file.

    This method should be called before the program exits to ensure all buffered data is written.

    Examples:
        >>> LOGGER.close()
        >>> LOGGER.info("This message will not be logged")
    """
    with self._lock:
        try:
            self.flush_buffer()
            if self._file:
                self._file.close()
                self._file = None
                self._writer = None
        except Exception as e:
            self.error(f"Error closing logger: {e}")

critical(msg, *args, **kwargs)

Log a critical message.

Ensures that the file handler is set up before logging.

Parameters:
    msg (object): The message to log.
    *args (object): Additional arguments.
    **kwargs (Any): Additional keyword arguments.
Source code in opensourceleg/logging/logger.py
def critical(self, msg: object, *args: object, **kwargs: Any) -> None:
    """
    Log a critical message.

    Ensures that the file handler is set up before logging.

    Args:
        msg (object): The message to log.
        *args (object): Additional arguments.
        **kwargs (Any): Additional keyword arguments.
    """
    self._ensure_file_handler()
    super().critical(msg, *args, **kwargs)

debug(msg, *args, **kwargs)

Log a debug message.

Ensures that the file handler is set up before logging.

Parameters:
    msg (object): The message to log.
    *args (object): Additional arguments.
    **kwargs (Any): Additional keyword arguments.
Source code in opensourceleg/logging/logger.py
def debug(self, msg: object, *args: object, **kwargs: Any) -> None:
    """
    Log a debug message.

    Ensures that the file handler is set up before logging.

    Args:
        msg (object): The message to log.
        *args (object): Additional arguments.
        **kwargs (Any): Additional keyword arguments.
    """
    self._ensure_file_handler()
    super().debug(msg, *args, **kwargs)

error(msg, *args, **kwargs)

Log an error message.

Ensures that the file handler is set up before logging.

Parameters:
    msg (object): The message to log.
    *args (object): Additional arguments.
    **kwargs (Any): Additional keyword arguments.
Source code in opensourceleg/logging/logger.py
def error(self, msg: object, *args: object, **kwargs: Any) -> None:
    """
    Log an error message.

    Ensures that the file handler is set up before logging.

    Args:
        msg (object): The message to log.
        *args (object): Additional arguments.
        **kwargs (Any): Additional keyword arguments.
    """
    self._ensure_file_handler()
    super().error(msg, *args, **kwargs)

flush_buffer()

Flush the buffered log data to the CSV file.

Ensures that the file handler is available, writes the header if not yet written, writes all buffered rows to the CSV, clears the buffer, and flushes the file.

Source code in opensourceleg/logging/logger.py
def flush_buffer(self) -> None:
    """
    Flush the buffered log data to the CSV file.

    Ensures that the file handler is available, writes the header if not yet written,
    writes all buffered rows to the CSV, clears the buffer, and flushes the file.
    """
    if not self._buffer or not self._enable_csv_logging:
        return

    with self._lock:
        try:
            self._ensure_file_handler()

            if self._file is None:
                try:
                    self._file = open(self._csv_path, "w", newline="")
                    self._writer = csv.writer(self._file)  # type: ignore[assignment]
                except Exception as e:
                    self.error(f"Failed to open CSV file {self._csv_path}: {e}")
                    # Clear buffer to prevent memory buildup
                    self._buffer.clear()
                    return

            if not self._header_written:
                self._write_header()

            try:
                self._writer.writerows(self._buffer)  # type: ignore[attr-defined]
                self._buffer.clear()
                self._file.flush()
            except Exception as e:
                self.error(f"Failed to write to CSV file: {e}")
                # Try to recover by reopening the file
                if self._file:
                    with contextlib.suppress(Exception):
                        self._file.close()
                self._file = None
                self._writer = None
                self._header_written = False
        except Exception as e:
            self.error(f"Unexpected error in flush_buffer: {e}")
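
`flush_buffer` writes the header exactly once, then appends all buffered rows in a single `writerows` call. The core write path, reduced to stdlib pieces (variable names here are illustrative):

```python
import csv
from collections import deque

# Rows accumulated by update(); oldest entries are dropped once maxlen is hit.
buffer = deque([["0.01", "1.0"], ["0.02", "1.1"]], maxlen=1000)
header = ["time", "knee_angle"]
header_written = False

with open("log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    if not header_written:
        writer.writerow(header)   # header goes out exactly once
        header_written = True
    writer.writerows(buffer)      # one bulk write for the whole buffer
    buffer.clear()                # buffer stays empty until the next update()
```

Batching rows this way keeps per-sample overhead out of the control loop: the expensive file I/O happens once per `buffer_size` samples rather than once per sample.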

get_tracked_variables()

Get a list of currently tracked variables and their current values.

Returns:
    List[Tuple[str, Any]]: A list of tuples containing variable names and their current values.

Source code in opensourceleg/logging/logger.py
def get_tracked_variables(self) -> list[tuple[str, Any]]:
    """
    Get a list of currently tracked variables and their current values.

    Returns:
        List[Tuple[str, Any]]: A list of tuples containing variable names and their current values.
    """
    with self._lock:
        result = []
        for var_id, get_value in self._tracked_vars.items():
            name = self._var_names.get(var_id, "unknown")
            try:
                value = get_value()
                result.append((name, value))
            except Exception as e:
                result.append((name, f"ERROR: {e}"))
        return result

info(msg, *args, **kwargs)

Log an info message.

Ensures that the file handler is set up before logging.

Parameters:
    msg (object): The message to log.
    *args (object): Additional arguments.
    **kwargs (Any): Additional keyword arguments.
Source code in opensourceleg/logging/logger.py
def info(self, msg: object, *args: object, **kwargs: Any) -> None:
    """
    Log an info message.

    Ensures that the file handler is set up before logging.

    Args:
        msg (object): The message to log.
        *args (object): Additional arguments.
        **kwargs (Any): Additional keyword arguments.
    """
    self._ensure_file_handler()
    super().info(msg, *args, **kwargs)

log(level, msg, *args, **kwargs)

Log a message with a specific log level.

Ensures that the file handler is set up before logging.

Parameters:
    level (int): The log level.
    msg (object): The message to log.
    *args (object): Additional arguments.
    **kwargs (Any): Additional keyword arguments.
Source code in opensourceleg/logging/logger.py
def log(self, level: int, msg: object, *args: object, **kwargs: Any) -> None:
    """
    Log a message with a specific log level.

    Ensures that the file handler is set up before logging.

    Args:
        level (int): The log level.
        msg (object): The message to log.
        *args (object): Additional arguments.
        **kwargs (Any): Additional keyword arguments.
    """
    self._ensure_file_handler()
    super().log(level, msg, *args, **kwargs)

reset()

Reset the Logger.

Closes the current file, reinitializes the logging handlers, clears tracked variables, and resets header status.

Source code in opensourceleg/logging/logger.py
def reset(self) -> None:
    """
    Reset the Logger.

    Closes the current file, reinitializes the logging handlers, clears tracked variables,
    and resets header status.
    """
    with self._lock:
        try:
            self.close()

            # Remove and clean up handlers
            if hasattr(self, "_file_handler"):
                self.removeHandler(self._file_handler)
                self._file_handler.close()
                del self._file_handler

            if hasattr(self, "_stream_handler"):
                self.removeHandler(self._stream_handler)
                self._stream_handler.close()  # Close the stream handler
                del self._stream_handler  # Delete the attribute

            # Reinitialize logging
            self._setup_logging()

            # Reset tracking and state variables
            self._tracked_vars.clear()
            self._var_names.clear()
            self._error_count.clear()
            self._header_written = False
            self._file = None
            self._writer = None

            self.debug("Logger reset successfully")
        except Exception as e:
            print(f"Error resetting logger: {e}")  # Use print as logger might be in bad state

set_buffer_size(buffer_size)

Set the maximum number of log entries to buffer before writing to the CSV file.

Parameters:
    buffer_size (int): The maximum number of log entries to buffer.
Source code in opensourceleg/logging/logger.py
def set_buffer_size(self, buffer_size: int) -> None:
    """
    Set the maximum number of log entries to buffer before writing to the CSV file.

    Args:
        buffer_size: The maximum number of log entries to buffer.
    """
    with self._lock:
        if buffer_size <= 0:
            self.warning(f"Invalid buffer size: {buffer_size}. Using default of 1000.")
            buffer_size = 1000
        self._buffer_size = buffer_size
        # Create a new buffer with the updated size and copy over existing items
        old_buffer = list(self._buffer)
        self._buffer = deque(maxlen=buffer_size)
        for item in old_buffer:
            self._buffer.append(item)

set_csv_logging(enable)

Enable or disable CSV logging.

Parameters:
    enable (bool): Whether to enable CSV logging.
Source code in opensourceleg/logging/logger.py
def set_csv_logging(self, enable: bool) -> None:
    """
    Enable or disable CSV logging.

    Args:
        enable (bool): Whether to enable CSV logging.
    """
    with self._lock:
        if self._enable_csv_logging != enable:
            self._enable_csv_logging = enable
            if not enable:
                self.flush_buffer()
                if self._file:
                    self._file.close()
                    self._file = None
                    self._writer = None
            self.debug(f"CSV logging {'enabled' if enable else 'disabled'}")

set_file_level(level)

Set the log level for file output.

Parameters:
    level (LogLevel): The log level for file output.

Examples:

>>> LOGGER.set_file_level(LogLevel.INFO)
>>> LOGGER.file_level
LogLevel.INFO
>>> LOGGER.debug("This is a debug message and will not be logged")
Source code in opensourceleg/logging/logger.py
def set_file_level(self, level: LogLevel) -> None:
    """
    Set the log level for file output.

    Args:
        level: The log level for file output.

    Examples:
        >>> LOGGER.set_file_level(LogLevel.INFO)
        >>> LOGGER.file_level
        LogLevel.INFO
        >>> LOGGER.debug("This is a debug message and will not be logged")
    """
    with self._lock:
        self._file_level = level
        if hasattr(self, "_file_handler"):
            self._file_handler.setLevel(level=level.value)

set_file_name(file_name)

Set the base name for the log file.

Parameters:
    file_name (Union[str, None]): The base name for the log file.

Examples:

>>> LOGGER.set_file_name("my_log_file")
>>> LOGGER.file_path
"./my_log_file.log"
Source code in opensourceleg/logging/logger.py
def set_file_name(self, file_name: Union[str, None]) -> None:
    """
    Set the base name for the log file.

    Args:
        file_name: The base name for the log file.

    Examples:
        >>> LOGGER.set_file_name("my_log_file")
        >>> LOGGER.file_path
        "./my_log_file.log"
    """
    with self._lock:
        try:
            # Ensure log directory exists
            os.makedirs(self._log_path, exist_ok=True)

            # Handle None file_name case
            if file_name is None:
                # Generate default name if none provided
                now = datetime.now()
                timestamp = now.strftime("%Y%m%d_%H%M%S")
                script_name = os.path.basename(__file__).split(".")[0]
                file_name = f"{script_name}_{timestamp}"
            elif "." in file_name:
                # If filename has an extension, remove it
                file_name = file_name.split(".")[0]

            self._user_file_name = file_name
            self._file_path = os.path.join(self._log_path, f"{file_name}.log")
            self._csv_path = os.path.join(self._log_path, f"{file_name}.csv")

            # If we already have a file handler, we need to recreate it
            if hasattr(self, "_file_handler"):
                self.removeHandler(self._file_handler)
                self._file_handler.close()
                del self._file_handler
                self._setup_file_handler()

            # Reset CSV file if it exists
            if self._file:
                self.close()
        except Exception as e:
            self.error(f"Error setting file name: {e}")
            raise

set_format(log_format)

Set the log message format. The format string uses the same syntax as the built-in Python logging module.

Parameters:
    log_format (str): The log message format.

Examples:

>>> LOGGER.set_format("[%(asctime)s] %(levelname)s: %(message)s")
>>> LOGGER.info("This is an info message")
[2022-01-01 12:00:00] INFO: This is an info message
Source code in opensourceleg/logging/logger.py
def set_format(self, log_format: str) -> None:
    """
    Set the log message format. The format string uses the same syntax as the built-in Python logging module.

    Args:
        log_format: The log message format.

    Examples:
        >>> LOGGER.set_format("[%(asctime)s] %(levelname)s: %(message)s")
        >>> LOGGER.info("This is an info message")
        [2022-01-01 12:00:00] INFO: This is an info message
    """
    with self._lock:
        self._log_format = log_format
        self._std_formatter = logging.Formatter(log_format)
        if hasattr(self, "_file_handler"):
            self._file_handler.setFormatter(fmt=self._std_formatter)
        self._stream_handler.setFormatter(fmt=self._std_formatter)
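
Because `set_format` ultimately builds a `logging.Formatter` and hands it to the handlers, any %-style attribute the standard library exposes is available in the format string. A quick way to check a format string in isolation:

```python
import logging

# set_format("...") boils down to handler.setFormatter(logging.Formatter(...)),
# so we can preview a format by applying a Formatter to a hand-built record.
fmt = logging.Formatter("[%(levelname)s] %(name)s: %(message)s")
record = logging.LogRecord(
    name="opensourceleg", level=logging.WARNING,
    pathname=__file__, lineno=1, msg="knee encoder saturated",
    args=None, exc_info=None,
)
print(fmt.format(record))  # [WARNING] opensourceleg: knee encoder saturated
```

Attributes such as `%(module)s`, `%(lineno)d`, and `%(threadName)s` work the same way; see the logging module's LogRecord attribute table for the full list.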

set_max_errors_before_untrack(max_errors)

Set the maximum number of errors before a variable is automatically untracked.

Parameters:
    max_errors (int): Maximum number of errors before untracking.
Source code in opensourceleg/logging/logger.py
def set_max_errors_before_untrack(self, max_errors: int) -> None:
    """
    Set the maximum number of errors before a variable is automatically untracked.

    Args:
        max_errors (int): Maximum number of errors before untracking.
    """
    with self._lock:
        if max_errors < 0:
            self.warning(f"Invalid max_errors value: {max_errors}. Using default of 5.")
            max_errors = 5
        self._max_errors_before_untrack = max_errors

set_stream_level(level)

Set the log level for console output.

Parameters:
    level (LogLevel): The log level for console output.

Examples:

>>> LOGGER.set_stream_level(LogLevel.INFO)
>>> LOGGER.stream_level
LogLevel.INFO
>>> LOGGER.debug("This is a debug message and will not be streamed")
Source code in opensourceleg/logging/logger.py
def set_stream_level(self, level: LogLevel) -> None:
    """
    Set the log level for console output.

    Args:
        level: The log level for console output.

    Examples:
        >>> LOGGER.set_stream_level(LogLevel.INFO)
        >>> LOGGER.stream_level
        LogLevel.INFO
        >>> LOGGER.debug("This is a debug message and will not be streamed")
    """
    with self._lock:
        self._stream_level = level
        self._stream_handler.setLevel(level=level.value)

set_stream_terminator(terminator)

Set the terminator for the stream handler.

Source code in opensourceleg/logging/logger.py
def set_stream_terminator(self, terminator: str) -> None:
    """
    Set the terminator for the stream handler.
    """
    with self._lock:
        self._stream_handler.terminator = terminator
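
Changing the terminator is how you get single-line status displays: the standard library's `StreamHandler` appends `handler.terminator` (default `"\n"`) after every record, so setting it to `"\r"` makes successive messages overwrite one another on the console. The stdlib behavior in isolation:

```python
import io
import logging

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.terminator = "\r"  # overwrite the same console line instead of scrolling

logger = logging.getLogger("status")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("loop 1")
logger.info("loop 2")
print(repr(stream.getvalue()))  # each record ends in \r rather than \n
```

On a real terminal (rather than a StringIO), the second message would redraw over the first, which is useful for high-rate loop telemetry.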

track_variable(var_func, name)

Record the value of a variable and log it to a CSV file.

Parameters:
    var_func (Callable[[], Any]): A function that returns the value of the variable.
    name (str): The name of the variable.

Examples:

>>> class MyClass:
...     def __init__(self):
...         self.value = 42
>>> obj = MyClass()
>>> LOGGER.track_variable(lambda: obj.value, "answer")
>>> LOGGER.update()
>>> LOGGER.flush_buffer()
Source code in opensourceleg/logging/logger.py
def track_variable(self, var_func: Callable[[], Any], name: str) -> None:
    """
    Record the value of a variable and log it to a CSV file.

    Args:
        var_func: A function that returns the value of the variable.
        name: The name of the variable.

    Examples:
        >>> class MyClass:
        ...     def __init__(self):
        ...         self.value = 42
        >>> obj = MyClass()
        >>> LOGGER.track_variable(lambda: obj.value, "answer")
        >>> LOGGER.update()
        >>> LOGGER.flush_buffer()
    """
    with self._lock:
        var_id = id(var_func)
        self._tracked_vars[var_id] = var_func
        self._var_names[var_id] = name
        self._error_count[var_id] = 0  # Initialize error count
        self.debug(f"Started tracking variable: {name}")

update()

Update the logger by logging the current values of tracked variables to the buffer.

Examples:

>>> class MyClass:
...     def __init__(self):
...         self.value = 42
>>> obj = MyClass()
>>> LOGGER.track_variable(lambda: obj.value, "answer")
>>> LOGGER.update()
Source code in opensourceleg/logging/logger.py
def update(self) -> None:
    """
    Update the logger by logging the current values of tracked variables to the buffer.

    Examples:
        >>> class MyClass:
        ...     def __init__(self):
        ...         self.value = 42
        >>> obj = MyClass()
        >>> LOGGER.track_variable(lambda: obj.value, "answer")
        >>> LOGGER.update()
    """
    if not self._tracked_vars or not self._enable_csv_logging:
        return

    with self._lock:
        data = []
        vars_to_untrack = []

        for var_id, get_value in self._tracked_vars.items():
            try:
                value = get_value()
                data.append(str(value))
                # Reset error count on successful retrieval
                self._error_count[var_id] = 0
            except Exception as e:
                var_name = self._var_names.get(var_id, "unknown")
                self.warning(f"Error getting value for {var_name}: {e}")
                data.append("ERROR")

                # Increment error count and check if we should untrack
                self._error_count[var_id] = self._error_count.get(var_id, 0) + 1
                if self._error_count[var_id] >= self._max_errors_before_untrack:
                    vars_to_untrack.append((var_id, var_name))

        # Only add data if we have variables to track
        if data:
            self._buffer.append(data)

        # Untrack variables with too many errors
        for var_id, var_name in vars_to_untrack:
            self._tracked_vars.pop(var_id, None)
            self._var_names.pop(var_id, None)
            self._error_count.pop(var_id, None)
            self.warning(
                f"Auto-untracked variable {var_name} after {self._max_errors_before_untrack} consecutive errors"
            )

        if len(self._buffer) >= self._buffer_size:
            self.flush_buffer()
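
`update()` evaluates every registered callable, buffers the stringified values as one CSV row, and drops any variable that fails too many times in a row. A condensed sketch of that bookkeeping (stdlib only; all names here are illustrative):

```python
from collections import deque

tracked = {}   # var_id -> zero-argument callable
names = {}     # var_id -> display name
errors = {}    # var_id -> consecutive failure count
buffer = deque(maxlen=1000)
MAX_ERRORS = 5


def track(fn, name):
    tracked[id(fn)] = fn
    names[id(fn)] = name
    errors[id(fn)] = 0


def update():
    row, to_drop = [], []
    for var_id, fn in tracked.items():
        try:
            row.append(str(fn()))
            errors[var_id] = 0            # a success resets the streak
        except Exception:
            row.append("ERROR")
            errors[var_id] += 1
            if errors[var_id] >= MAX_ERRORS:
                to_drop.append(var_id)    # auto-untrack noisy variables
    if row:
        buffer.append(row)
    for var_id in to_drop:
        for table in (tracked, names, errors):
            table.pop(var_id, None)


track(lambda: 42, "answer")
bad = lambda: 1 / 0
track(bad, "broken")
for _ in range(5):
    update()
# "broken" raised five times in a row, so only "answer" is still tracked.
```

The consecutive-error counter is the key design choice: a sensor that glitches once keeps logging, while a getter that is permanently broken stops polluting every subsequent row.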

warning(msg, *args, **kwargs)

Log a warning message.

Ensures that the file handler is set up before logging.

Parameters:
    msg (object): The message to log.
    *args (object): Additional arguments.
    **kwargs (Any): Additional keyword arguments.
Source code in opensourceleg/logging/logger.py
def warning(self, msg: object, *args: object, **kwargs: Any) -> None:
    """
    Log a warning message.

    Ensures that the file handler is set up before logging.

    Args:
        msg (object): The message to log.
        *args (object): Additional arguments.
        **kwargs (Any): Additional keyword arguments.
    """
    self._ensure_file_handler()
    super().warning(msg, *args, **kwargs)