class buffer.TextBuffer ¶
Buffer object for dynamic editing of text.
Functions, properties and behavior are modeled on those of a text editing program.
- Features and functions overview:
- Cursor navigation:
- ordinary movement.
- page up/down.
- pos1/end.
- jumping (like ctrl+arrow).
- go to data point/row/line x.
- ...
- Writing:
- writing characters / strings (text).
- four substitution modes.
- non-breaking line breaks (count in length like one character, but are represented with "" in the rows).
- tab transformation.
- backspace / delete.
- ...
- Subsequent components:
- __local_history__:
- undo/redo support.
- branch fork support.
- __marker__:
- support for marking text areas (like shift+arrow).
- further processing of markings.
- __trimmer__[/__swap__]:
- management of the buffer size.
This main object is practically the handler for the component objects (_buffercomponents), which are created and processed in close dependence on the buffer and on each other.
Some features and functions can be installed after the initialization of the main object via methods with the init_ prefix. Subsequently installed or standard components of the buffer are assigned to the following attributes:

Attribute | Type | Kind
---|---|---
__swap__ | _Swap | Optional component, assigned by an init_* method
__trimmer__ | _Trimmer | Optional component, assigned by an init_* method
__local_history__ | _LocalHistory | Optional component, assigned by an init_* method
__marker__ | _Marker | Optional component, assigned by an init_* method
__glob_cursor__ | _GlobCursor | Permanent component

When editing the buffer, it must be noted that an edit always results in an adjustment of the other data. Because of this, and because of the conversion of data into Python types, the buffer is not suitable for "Big Data" and shows weaknesses with extremely long lines when a highlighter is used in the display. Nevertheless, even a larger number of rows can be processed with the help of a swap.
For this the attributes
- __start_point_data__
- __start_point_content__
- __start_point_row_num__
- __start_point_line_num__
are relevant and indicate the current start of the data in the buffer. The attributes
- __n_rows__
- __n_newlines__
indicate the number of lines and rows in the current buffer (a line, compared to a row, defines the data between line breaks).
The properties
- __eof_data__
- __eof_content__
- __eof_row_num__
- __eof_line_num__
calculate the endpoints of all data in the buffer and swap.
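For illustration, a minimal sketch of reading these metadata; buf stands for an assumed, already initialized TextBuffer instance:
>>> buf.__n_rows__, buf.__n_newlines__                       # rows and lines currently held in the buffer
>>> buf.__start_point_data__, buf.__start_point_row_num__    # where the currently loaded data starts
>>> buf.__eof_data__, buf.__eof_row_num__                    # endpoints over all data in buffer and swap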
- Parameterization:
- top_row_vis_maxsize
- Set a maximum width of the visual representation of row data for the first row.
- future_row_vis_maxsize
- Set a maximum width of the visual representation of row data for the remaining rows.
- tab_size
- The visual size of a tab in characters.
- tab_to_blank
- Replace the relative tab range directly with blanks when writing.
- autowrap_points
- Specify a definition of characters via a re.Pattern at which a row should be wrapped when reaching the maximum width; wrapping is then done at the end of the re.Match. If True is passed, a predefined pattern is used; False disables the function.
- jump_points_re
- Jump points for a special cursor movement defined as a re.Pattern. Applied when the cursor is moved forward in jump mode; the cursor is then moved to the end point of the re.Match or to the end of the row. If None is passed, a predefined pattern is used.
- back_jump_re
- Jump points for a special cursor movement defined as a re.Pattern. Applied when the cursor is moved backward in jump mode; the cursor is then moved to the start point of the re.Match or to the beginning of the row. If None is passed, a predefined pattern is used.
ChunkBuffer: Type[ChunkBuffer]
ChunkIter: Type[ChunkIter]
DEFAULT_AUTOWRAP_RE: Pattern = compile('[\s()\[\]{}\\/&_-](?=\w*$)')
Used when autowrap_points is parameterized with True.
DEFAULT_BACK_JUMP_RE: Pattern = compile('(?<=\w)(?=\W)|(?<=\W)(?=\w)')
Used when back_jump_re is parameterized with None.
DEFAULT_JUMP_POINTS_RE: Pattern = compile('(?<=\w)(?=\W)|(?<=\W)(?=\w)')
Used when jump_points_re is parameterized with None.
__cursor_translation_cache_sizes__: tuple[int, int, int, int, int, int]
Evaluated by the _CursorCache of a _Row to determine the maximum sizes of the method-return-value caches.
- slot 0: tool_vis_to_cnt_excl (default size: 4)
- slot 1: tool_seg_in_seg_to_vis (default size: 8)
- slot 2: tool_cnt_to_seg_in_seg (default size: 16)
- slot 3: tool_cnt_to_vis (default size: 32)
- slot 4: tool_vis_to_cnt (default size: 32)
- slot 5: content_limit (default size: 1)
__display__: _DisplayBase
__glob_cursor__: _GlobCursor
__local_history__: _LocalHistory
__marker__: _Marker
__n_newlines__: int
__n_rows__: int
__start_point_content__: int
__start_point_data__: int
__start_point_line_num__: int
__start_point_row_num__: int
__swap__: _Swap
__trimmer__: _Trimmer
current_row_idx: int
rows: list[_Row]
@property __eof_content__() -> int ¶
@property __eof_data__() -> int ¶
@property __eof_line_num__() -> int ¶
@property __eof_metas__() -> tuple[int, int, int, int] ¶
@property __eof_row_num__() -> int ¶
The row where the cursor is located.
raises:
- IndexError: Possible when called during an edit or after an external edit (after an edit a buffer indexing and cursor navigation should always be done).
backspace() -> tuple[WriteItem, ChunkLoad] | None ¶
Delete the character to the left of the cursor; return None if there is none, otherwise ( WriteItem, ChunkLoad ).
- Relevant ChunkLoad Fields:
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __local_history__ [+] __local_history__.lock [+] __swap__.fill [+] __trimmer__.trim [+] __highlighter__.prep_by_chunkload | __highlighter__.prep_by_write [+] __marker__.conflict [+] __marker__.adjust [+] __glob_cursor__.adjust [+] __glob_cursor__.note
raises:
- AssertionError: __local_history__ lock is engaged.
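A minimal usage sketch; buf stands for an assumed, already initialized TextBuffer instance:
>>> res = buf.backspace()          # None if there is no character to the left of the cursor
>>> if res is not None:
...     write_item, chunk_load = res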
cursor_move(*, z_row=None, z_column=None, jump=False, mark_jump=False, border=False, as_far=False, mark=False, cross=True) -> ChunkLoad | None ¶
Move the cursor and return ChunkLoad if the cursor was moved, otherwise None.
- Relevant ChunkLoad Fields:
- spec_position (possible with mark_jump or __marker__.backjump_mode)
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __swap__.fill [+] __trimmer__.trim [+] __highlighter__.prep_by_chunkload [+] __glob_cursor__.note
- Parameters:
- z_row
- This summand is added to the current row position.
- z_column
- This summand is added to the current column position. Or direction hint for jump, mark_jump and border.
- jump
- Jump to a predefined location.
- mark_jump
- Jump to the next boundary point of a marking.
- border
- Jump to the beginning or end of a row.
- as_far
- In case of errors, go as far as possible.
- mark
- Expand / create a marker.
- cross
- Allow the crossing of the row borders.
raises:
- CursorError: the following are only possible if mark_jump is used or a mark-back-jump is triggered and the data point is not reachable.
- CursorChunkLoadError: if n is not in the range of the currently loaded chunks and the chunks of the required side cannot be loaded completely/are not available.
- CursorChunkMetaError: Chunks of the required side could not be loaded sufficiently. The closest chunk was loaded and the cursor was placed at the beginning of the first row.
- CursorPlacingError: if an error occurs during the final setting of the cursor (indicator of too high value). The cursor was set to the next possible position.
- CursorNegativeIndexingError: when the final value is negative.
- AssertionError: mark_jump used or mark-back-jump triggered and __local_history__ lock is engaged.
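A minimal sketch of common movements; buf stands for an assumed, already initialized TextBuffer instance, and the return value (ChunkLoad | None) is ignored here:
>>> buf.cursor_move(z_column=1)                  # one column to the right
>>> buf.cursor_move(z_row=-1)                    # one row up
>>> buf.cursor_move(z_column=1, jump=True)       # forward jump (like ctrl+arrow)
>>> buf.cursor_move(z_column=-1, border=True)    # to the beginning of the row
>>> buf.cursor_move(z_column=1, mark=True)       # expand/create a marking while moving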
cursor_new(z_row=None, z_column=None, jump=False, mark_jump=False, border=False, as_far=False) -> tuple[int | None, int | None] | int | None ¶
Calculate new cursor coordinates and return a positive result as (<new row>, <new column>) or <boundary point of a marking>, otherwise None.
- Parameters:
- z_row
- This summand is added to the current row position.
- z_column
- This summand is added to the current column position. Or direction hint for jump, mark_jump and border.
- jump
- Jump to a predefined location.
- mark_jump
- Jump to the next boundary point of a marking.
- border
- Jump to the beginning or end of a row.
- as_far
- In case of errors, go as far as possible.
cursor_set(n_row_index, n_column, as_far=False) -> bool ¶
Set the cursor to n_column in n_row_index; in case of errors, as far as possible. Return whether the cursor was set.
delete() -> tuple[WriteItem, ChunkLoad] | None ¶
Delete the character to the right of the cursor; return None if there is none, otherwise ( WriteItem, ChunkLoad ).
- Relevant ChunkLoad Fields:
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __local_history__ [+] __local_history__.lock [+] __swap__.fill [+] __trimmer__.trim [+] __highlighter__.prep_by_chunkload | __highlighter__.prep_by_write [+] __marker__.conflict [+] __marker__.adjust [+] __glob_cursor__.adjust [+] __glob_cursor__.note
raises:
- AssertionError: __local_history__ lock is engaged.
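A hedged sketch of both methods; buf stands for an assumed, already initialized TextBuffer instance:
>>> buf.cursor_set(0, 4, as_far=True)    # column 4 in row index 0, or as close as possible
>>> res = buf.delete()                   # None if there is no character to the right of the cursor
>>> if res is not None:
...     write_item, chunk_load = res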
export_bufferdb(dst, *, invoke_marker=True, invoke_history=True, invoke_cursor_anchors=True) -> None ¶
Export a backup database in SQLite 3 format; the invoke_* parameters control whether the corresponding component data is included. The destination database can be defined by an ordinary path, a URI or an SQL connection.
raises:
- ValueError: if ":history:" , ":swap:" or ":memory:" is used as dst.
- DatabaseError: unspecific sql database error.
- DatabaseFilesError: dst already exists.
- DatabaseTableError: if the database tables already exist in the destination.
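A minimal sketch; buf stands for an assumed, already initialized TextBuffer instance, and the paths are illustrative only:
>>> buf.export_bufferdb("backup.db")                                     # ordinary path, URI or SQL connection
>>> buf.export_bufferdb("backup-no-history.db", invoke_history=False)    # skip local-history data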
find(regex, end=False, *, all=False, reverse=False) -> list[tuple[_Row, Match]] ¶
Find the regular expression, starting from the current row, before the cursor or find reverse.
If end is "", "\n" or None, the end of a _Row is part of the condition and must be exactly the same; if end is True, the end of a row must be "" or "\n"; False skips the comparison.
If all is True, the search is started from the top and all matches are returned.
The return value for matches is: ( _Row, re.Match )
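A hedged sketch; buf stands for an assumed, already initialized TextBuffer instance, and passing a compiled pattern for regex is an assumption:
>>> import re
>>> for row, match in buf.find(re.compile(r"TODO"), all=True):
...     print(match.group())                      # every match in the whole data, from the top
>>> buf.find(re.compile(r"TODO"), reverse=True)   # search backwards instead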
goto_chunk(position_id, autofill=False) -> ChunkLoad ¶
Dump the currently loaded chunks, then load a specific chunk. Record the current position in __local_history__ beforehand.
- Relevant ChunkLoad Fields:
- spec_position
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __local_history__ [+] __local_history__.lock [+] __swap__.fill [+] __trimmer__.trim [+] __highlighter__.prep_by_chunkload [+] __glob_cursor__.note
raises:
- CursorChunkLoadError: if the position is not available.
Go to data point n. Record the current position in __local_history__ beforehand.
- Relevant ChunkLoad Fields:
- spec_position
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __local_history__ [+] __local_history__.lock [+] __swap__.adjust [+] __swap__.fill [+] __trimmer__.trim [+] __highlighter__.prep_by_chunkload [+] __glob_cursor__.note
raises:
- CursorChunkLoadError: if n is not in the range of the currently loaded chunks and the chunks of the required side cannot be loaded completely/are not available.
- CursorChunkMetaError: Chunks of the required side could not be loaded sufficiently. The closest chunk was loaded and the cursor was placed at the beginning of the first row.
- CursorPlacingError: if an error occurs during the final setting of the cursor (indicator of too high value). The cursor was set to the next possible position.
- CursorNegativeIndexingError: when a negative value is passed.
- AssertionError: __local_history__ lock is engaged.
goto_line(__n=0, *, to_bottom=False, as_far=False) -> ChunkLoad ¶
Go to the beginning of the line with number n; with as_far, go as far as possible instead of raising a CursorChunkLoadError; with to_bottom, go to the last line. Record the current position in __local_history__ beforehand.
- Relevant ChunkLoad Fields:
- spec_position
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __local_history__ [+] __local_history__.lock [+] __swap__.adjust [+] __swap__.fill [+] __trimmer__.trim [+] __highlighter__.prep_by_chunkload [+] __glob_cursor__.note
raises:
- AssertionError: __local_history__ lock is engaged.
- CursorChunkLoadError: if n is not in the range of the currently loaded chunks and the chunks of the required side cannot be loaded completely/are not available.
- CursorNegativeIndexingError: when a negative value is passed and as_far is False.
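A minimal sketch; buf stands for an assumed, already initialized TextBuffer instance:
>>> buf.goto_line(42)                    # go to the beginning of line 42
>>> buf.goto_line(42, as_far=True)       # or as far as possible instead of raising
>>> buf.goto_line(to_bottom=True)        # go to the last line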
goto_row(__n=0, *, to_bottom=False, as_far=False) -> ChunkLoad ¶
Go to the beginning of the row with number n; with as_far, go as far as possible instead of raising a CursorChunkLoadError; with to_bottom, go to the last row. Record the current position in __local_history__ beforehand.
- Relevant ChunkLoad Fields:
- spec_position
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __local_history__ [+] __local_history__.lock [+] __swap__.adjust [+] __swap__.fill [+] __trimmer__.trim [+] __highlighter__.prep_by_chunkload [+] __glob_cursor__.note
raises:
- AssertionError: __local_history__ lock is engaged.
- CursorChunkLoadError: if n is not in the range of the currently loaded chunks and the chunks of the required side cannot be loaded completely/are not available.
- CursorNegativeIndexingError: when a negative value is passed and as_far is False.
import_bufferdb(src, *, init=False, warnings=False, errors=True, critical=True) -> None ¶
Import a sqlite 3 backup database.
- src
- The source database can be defined by an ordinary path, a URI or a SQL connection.
- init
- Reinitialize the buffer and components if the buffer is not in the initial state.
- warnings
- Raise warnings (if a component is present in the buffer but corresponding data is missing in the database).
- errors
- Raise errors (if a component is not present in the buffer but corresponding data is in the database).
- critical
- Raise critical errors (when a swap is not present in the buffer but corresponding data is in the database). When ignored, the first chunk from the top is read into the buffer. [ ! ] Makes the buffer unstable.
raises:
- DatabaseError: unspecific sql database error.
- DatabaseFilesError: src does not exist.
- CursorError: the following are possible if the data point of the cursor from the database is not reachable.
- CursorChunkLoadError: if n is not in the range of the currently loaded chunks and the chunks of the required side cannot be loaded completely/are not available.
- CursorChunkMetaError: Chunks of the required side could not be loaded sufficiently. The closest chunk was loaded and the cursor was placed at the beginning of the first row.
- CursorPlacingError: if an error occurs during the final setting of the cursor (indicator of too high value). The cursor was set to the next possible position.
- CursorNegativeIndexingError: when a negative value is passed.
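A minimal sketch; buf stands for an assumed TextBuffer instance, and the path is illustrative only:
>>> buf.import_bufferdb("backup.db", init=True)    # reinitialize first if the buffer is not in the initial state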
indexing(start_idx=0) -> tuple[int, int, int, int] ¶
Index the current buffer and return the index-start-data for the next chunk.
(abs_dat, abs_cnt, row_num, lin_num)
init_localhistory(maximal_items, items_chunk_size, maximal_items_action, undo_lock, branch_forks, db_path, unlink_atexit) -> TextBuffer ¶
Initialize local history ( _LocalHistory ).
Overview of some feature interfaces:
>>> TextBuffer.__local_history__.undo()
>>> TextBuffer.__local_history__.redo()
>>> TextBuffer.__local_history__.lock_release()
>>> TextBuffer.__local_history__.branch_fork()
Parameterization:
- db_path
The location of the database can be specified as a filepath using an ordinary path or a "Uniform Resource Identifier" (URI) string; to create the database temporarily in RAM, the expression ":memory:" can be used; another special expression is ":swap:" to create the database in the same location as the database in _Swap.
- unlink_atexit
Registers the deletion of the database when exiting the Python interpreter.
- The process is performed depending on the location of the database:
- If the database was created as simple files, the connection is closed and the files are deleted from disk (including journals);
- If an existing connection was passed (internal usage) or the database was created in RAM, the connection will be closed;
- If the expression ":swap:" was used during creation, all LocalHistory entries in the database are deleted (unless the database was closed before).
- undo_lock
Enables the undo lock feature. Blocks processing of the buffer immediately after an undo action until the lock is released.
- branch_forks
Enables the chronological forks feature. Allows to switch between the last undo branch.
- maximal_items
Sets an upper limit for chronological items.
None corresponds to no limit. The final value is composed of maximal_items + items_chunk_size.
- items_chunk_size
Defines the number of chronological items that will be removed when the upper limit is reached. The final value of the upper limit is composed of maximal_items + items_chunk_size.
- maximal_items_action
Executed before chronological items are removed when the upper limit is reached. Does not receive any parameters.
raises:
- ConfigurationError: LocalHistory is not compatible with Trimmer's drop-morph or if ":swap:" is used and there is no connection to a database in __swap__.
- DatabaseFilesError: db_path already exists.
- DatabaseTableError: if the database tables already exist in the destination.
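A hedged parameterization sketch; buf stands for an assumed TextBuffer instance, and the concrete values are illustrative only:
>>> buf.init_localhistory(
...     maximal_items=1000, items_chunk_size=10,
...     maximal_items_action=lambda: None,
...     undo_lock=False, branch_forks=True,
...     db_path=":memory:", unlink_atexit=True)
>>> buf.__local_history__.undo()
>>> buf.__local_history__.redo()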
init_marker(multy_marks, backjump_mode) -> TextBuffer ¶
Initialize markers ( _Marker ).
Overview of some feature interfaces:
>>> TextBuffer.cursor_move(mark=, mark_jump=)
>>> TextBuffer.__marker__.add_marks()
>>> TextBuffer.__marker__.pop_aimed_mark()
>>> TextBuffer.__marker__.reader()
>>> TextBuffer.__marker__.marked_shift()
>>> TextBuffer.__marker__.marked_tab_replace()
>>> TextBuffer.__marker__.marked_remove()
init_rowmax__drop(rows_maximal, chunk_size, keep_top_row_size, action) -> TextBuffer ¶
Set the upper limit of rows in the current buffer and drop the cut chunks into action ( _Trimmer ).
raises:
- ConfigurationError: Trimmer already initialized or LocalHistory is not compatible with Trimmer's drop-morph.
- ValueError: The quotient of rows_maximal - 1 and chunk_size - 1 is around 0.
init_rowmax__restrict(rows_maximal, last_row_maxsize) -> TextBuffer ¶
Limit the size of the buffer so that no further entries remain ( _Trimmer ).
raises:
- ConfigurationError: Trimmer already initialized.
init_rowmax__swap(rows_maximal, chunk_size, load_distance, keep_top_row_size, db_path, unlink_atexit) -> TextBuffer ¶
Set the upper limit of rows in the current buffer and initialize a swap for the cut chunks ( _Trimmer <-> _Swap ).
Parameterization:
- db_path
The location of the database can be specified as a filepath using an ordinary path or a "Uniform Resource Identifier" (URI) string; to create the database temporarily in RAM, the expression ":memory:" can be used; another special expression is ":history:" to create the database in the same location as the database in _LocalHistory.
- from_db
To build the database and the object from an existing database, the origin can be passed as an ordinary path, a URI, or an existing SQL connection. The connection will be closed automatically afterwards, unless an SQL connection was passed.
- unlink_atexit
Registers the deletion of the database when exiting the Python interpreter.
- The process is performed depending on the location of the database:
- If the database was created as simple files, the connection is closed and the files are deleted from disk (including journals);
- If an existing connection was passed (internal usage) or the database was created in RAM, the connection will be closed;
- If the expression ":history:" was used during creation, all Swap entries in the database are deleted (unless the database was closed before).
- rows_maximal
Used as the limit for fillings.
- keep_top_row_size
After loading chunks, adjusts the top row in the buffer to the allocated size of the "top row".
- load_distance
Distance between the cursor and the edge of the currently loaded chunks in the buffer at which loading is triggered.
raises:
- ValueError: The quotient of rows_maximal - 1 and chunk_size - 1 is around 0.
- ConfigurationError: Trimmer already initialized or ":history:" is used and there is no connection to a database in __local_history__.
- DatabaseFilesError: db_path already exists.
- DatabaseTableError: if the database tables already exist in the destination.
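A hedged parameterization sketch; buf stands for an assumed TextBuffer instance, and the concrete values are illustrative only:
>>> buf.init_rowmax__swap(
...     rows_maximal=1000, chunk_size=100, load_distance=50,
...     keep_top_row_size=True,
...     db_path=":memory:", unlink_atexit=True)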
reader(*, bin_mode=False, endings=None, tabs_to_blanks=False, replace_tabs=None, progress=0, dat_ranges=None) -> Reader ¶
Factory method for Reader.
reinitialize() -> None ¶
Reinitialize the buffer.
[+] __local_history__ [+] __swap__ [+] __marker__ [+] __highlighter__.prep_by_none
remove(coords, coord_type) -> tuple[list[tuple[int, list[tuple[list[str], str | Literal[False] | None]]]], ChunkLoad] ¶
Remove data from the buffer and return it. Adjust the cursor accordingly.
The data coordinates are defined as a list of data points [ <int>, ... ] or as a list of data ranges [ [<int>, <int>], ... ] (both must be sorted). The data type is specified by coord_type;
- Possible values:
"pointing data"
: Remove the whole rows that match the data coordinates."data"
: Remove data ranges, coord must be defined as a list of ranges for this type."row"
: Remove entire rows, coord must be formulated with the row numbers for this."line"
: Remove entire lines, coord must be formulated with the line numbers for this (compared to a row, a line is defined as the data between two line breaks + final line break).The return value is composed as follows:
( [ ( coord start: int, removed rows: [ ( row raster: [str], row end: "" | "\n" | None | False ), ... ] ), ... ], final chunk load item)
At index 1 of the tuple is the ChunkLoad.
At index 0 of the tuple there is a list of items of the removed data:
An item is composed of the starting point of the coordinate at index 0 and a list of row data at index 1:
Row data is a tuple of the removed content data (tab-separated string of printable characters) at index 0 and the removed row end (can be "\n" for a line break, "" for a non-breaking line break, None if the row has no line break, or False as a non-removed end) at index 1.
- Relevant ChunkLoad Fields:
- edited_ran
- spec_position
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __local_history__ [+] __local_history__.lock [+] __swap__.adjust [+] __swap__.fill [+] __trimmer__.trim [+] __marker__.adjust [+] __glob_cursor__.adjust [+] __highlighter__.prep_by_chunkload
raises:
- AssertionError: __local_history__ lock is engaged.
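A hedged sketch; buf stands for an assumed, already initialized TextBuffer instance:
>>> removed, chunk_load = buf.remove([[0, 10]], "data")   # remove the data range 0..10
>>> removed, chunk_load = buf.remove([2, 5], "row")       # remove rows 2 and 5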
overload resize(*, size_top_row=..., size_future_row=..., trimmer__rows_maximal=..., trimmer__chunk_size=...) -> ChunkLoad ¶
overload resize(*, size_top_row=..., size_future_row=..., trimmer__rows_maximal=..., trimmer__last_row_maxsize=...) -> ChunkLoad ¶
resize(**kwargs) -> ChunkLoad ¶
Change the parameterization of the maximum lengths of the rows and perform an adjustment.
- Relevant ChunkLoad Fields:
- spec_position
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __swap__.adjust [+] __swap__.fill [+] __trimmer__.trim [+] __highlighter__.prep_by_chunkload [+] __marker__.adjust [+] __glob_cursor__.adjust
rowwork(coords, coord_type, worker, goto, unique_rows=False) -> tuple[list[tuple[list[int, int] | int, list[WriteItem | None]]], ChunkLoad] | None ¶
[ ADVANCED USAGE ]
The method allows editing of rows by worker in an iteration with corresponding adjustment of components, the display and the metadata. For this, worker must always return a WriteItem, or None if the row was not edited. WARNING: WriteItem.Overflow is NOT handled.
In the iteration, worker receives the _Row and the corresponding coordinate, which is ORIENTED to the input and originates from ChunkIter.ParsedCoords.
The iteration mode is "coords reversed + s": the iteration runs backwards through the original coordinates and forwards through the rows of the respective coordinate. See also ChunkIter.
Finally, the cursor position is recalled via goto.
The data coordinates are defined as a list of data points [ <int>, ... ] or as a list of data ranges [ [<int>, <int>], ... ] (both must be sorted). The data type is specified by coord_type;
- Possible values:
"data"
: Determine the rows using data coordinates, coord must be formulated with the data points for this."content"
: Determine the rows using content coordinates, coord must be formulated with the content points for this."row"
: Determine the rows by row numbers, coord must be formulated with the row numbers for this."line"
: Determine the rows by line numbers, coord must be formulated with the line numbers for this (compared to a row, a line is defined as the data between two line breaks + final line break).If coords is
None
, then iterate through the entirety of the data and ignore coord_type.If unique_rows is
True
, each row is processed only once, even if multiple coordinates apply to one.The return value is composed as follows:
( [ ( coordinates: list[int, int] | int, write items: [ WriteItem | None, ... ] ), ... ], ChunkLoad ) | None
At index 1 of the tuple is the ChunkLoad.
At index 0 of the tuple there is a list of (coordinate to WriteItem's) pairs:
A pair is composed of the coordinate at index 0 and a list of WriteItem's | None at index 1:
The list of WriteItem's corresponds to the rows in the coordinate; an entry is None if editing has not taken place in a row.
The total return value can be None if nothing was edited.
[+] __local_history__ [+] __local_history__.lock [+] __swap__.adjust [+] __swap__.fill [+] __trimmer__.trim [+] __marker__.adjust [+] __glob_cursor__.adjust [+] __highlighter__.prep_by_chunkload
raises:
- AssertionError: __local_history__ lock is engaged.
- CursorError: the following are possible if the data point from goto is not reachable.
- CursorChunkLoadError: if n is not in the range of the currently loaded chunks and the chunks of the required side cannot be loaded completely/are not available.
- CursorChunkMetaError: Chunks of the required side could not be loaded sufficiently. The closest chunk was loaded and the cursor was placed at the beginning of the first row.
- CursorPlacingError: if an error occurs during the final setting of the cursor (indicator of too high value). The cursor was set to the next possible position.
- CursorNegativeIndexingError: when a negative value is passed.
shift_rows(coords, coord_type, *, backshift=False, unique_rows=True) -> tuple[list[tuple[list[int, int] | int, list[WriteItem | None]]], ChunkLoad] | None ¶
Shift rows.
origin: "\t foo bar" " foo bar"
shifted origin (`tab-to-blanks-mode` not configured): "\t\t foo bar" "\t foo bar"
backshifted origin: " foo bar" "foo bar"
The data coordinates are defined as a list of data points [ <int>, ... ] or as a list of data ranges [ [<int>, <int>], ... ] (both must be sorted). The data type is specified by coord_type;
- Possible values:
"pointing data"
: Determine the rows using data coordinates."content"
: Determine the rows using content coordinates."row"
: Determine the rows by row numbers, coord must be formulated with the row numbers for this."line"
: Determine the rows by line numbers, coord must be formulated with the line numbers for this (compared to a row, a line is defined as the data between two line breaks + final line break).If coords is
None
, then shift the entirety of the rows and ignore coord_type.If unique_rows is
True
(default), each row is processed only once, even if multiple coordinates apply to one.The return value is composed as follows:
( [ ( coordinates: list[int, int] | int, write items: [ WriteItem | None, ... ] ), ... ], ChunkLoad ) | None
At index 1 of the tuple is the ChunkLoad.
At index 0 of the tuple there is a reversed list of (coordinate to WriteItem's) pairs:
A pair is composed of the coordinate at index 0 and a list of WriteItem's | None at index 1:
The list of WriteItem's corresponds to the rows in the coordinate; an entry is None if editing has not taken place in a row.
The total return value can be None if nothing was edited.
- Relevant ChunkLoad Fields:
- edited_ran
- spec_position
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __local_history__ [+] __local_history__.lock [+] __swap__.adjust [+] __swap__.fill [+] __trimmer__.trim [+] __marker__.adjust [+] __glob_cursor__.adjust [+] __highlighter__.prep_by_chunkload
raises:
- AssertionError: __local_history__ lock is engaged.
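A hedged sketch; buf stands for an assumed, already initialized TextBuffer instance:
>>> buf.shift_rows([0, 1, 2], "row")                      # indent rows 0, 1 and 2
>>> buf.shift_rows([0, 1, 2], "row", backshift=True)      # remove the indentation again
>>> buf.shift_rows(None, "row")                           # coords None: shift all rows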
tab_replace(coords, coord_type, *, to_char=" ") -> tuple[list[tuple[list[int, int] | int, list[WriteItem | None]]], ChunkLoad] | None ¶
Replace tab spaces in coords (of type coord_type) with to_char. Adjust the cursor accordingly.
The data coordinates are defined as a list of data points [ <int>, ... ] or as a list of data ranges [ [<int>, <int>], ... ] (both must be sorted). The data type is specified by coord_type;
- Possible values:
"pointing data"
: Replace tab spaces of the whole rows that match the data coordinates."data"
: Replace tab spaces in data ranges, coord must be defined as a list of ranges for this type."row"
: Replace tab spaces in entire rows, coord must be formulated with the row numbers for this."line"
: Replace tab spaces in entire lines, coord must be formulated with the line numbers for this (compared to a row, a line is defined as the data between two line breaks + final line break).If coords is
None
, then replace tab spaces of the entirety of the data and ignore coord_type.The return value is composed as follows:
( [ ( coordinates: list[int, int] | int, write items: [ WriteItem | None, ... ] ), ... ], ChunkLoad ) | None
At index 1 of the tuple is the ChunkLoad.
At index 0 of the tuple there is a list of (coordinate to WriteItem's) pairs:
A pair is composed of the coordinate at index 0 and a list of WriteItem's | None at index 1:
The list of WriteItem's corresponds to the rows in the coordinate; an entry is None if editing has not taken place in a row.
The total return value can be None if nothing was edited.
- Relevant ChunkLoad Fields:
- edited_ran
- spec_position
- top_nload
- btm_nload
- top_cut
- btm_cut
[+] __local_history__ [+] __local_history__.lock [+] __swap__.adjust [+] __swap__.fill [+] __trimmer__.trim [+] __marker__.adjust [+] __glob_cursor__.adjust [+] __highlighter__.prep_by_chunkload
raises:
- AssertionError: __local_history__ lock is engaged.
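A hedged sketch; buf stands for an assumed, already initialized TextBuffer instance:
>>> buf.tab_replace(None, "row")                 # coords None: replace tab spaces in the entire data
>>> buf.tab_replace([3, 4], "row", to_char=" ")  # replace tab spaces in rows 3 and 4 with blanks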
write(string, *, sub_chars=False, force_sub_chars=False, sub_line=False, associate_lines=False, nbnl=False, move_cursor=True) -> tuple[WriteItem, ChunkLoad] ¶
Write at the position of the cursor. [ ! ] CR ( "\r" ) is not allowed.
Write in substitute_chars mode to replace characters associatively to the input in the row, at most up to the next tab (only used if neither a newline nor a tab is present in the string); OR
don't care about tabs in the input and apply the substitution also to tabs when the mode forcible_substitute_chars is active; OR
substitute the entire row(s) until the next linebreak from the cursor position in mode substitute_line; OR
replace rows until the next line break associatively to the number of entered lines in associate_lines mode;
and replace line breaks with non-breaking line breaks when nbnl (non-breaking-new-line) is set to True.
Finally, move the cursor relative to the input (move_cursor).
[+] __local_history__ [+] __local_history__.lock [+] __swap__.adjust [+] __trimmer__.trim [+] __highlighter__.prep_by_write [+] __marker__.conflict [+] __marker__.adjust [+] __glob_cursor__.adjust [+] __glob_cursor__.note
raises:
- AssertionError: __local_history__ lock is engaged.
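A hedged sketch; buf stands for an assumed, already initialized TextBuffer instance:
>>> buf.write("hello world")             # insert at the cursor position
>>> buf.write("hello\tworld\n")          # tabs and line breaks are processed by the buffer
>>> buf.write("X", sub_chars=True)       # substitution mode instead of insertion
>>> buf.write("A\nB\n", nbnl=True)       # write line breaks as non-breaking line breaks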
Date: | 21 Dec 2022 |
---|---|
Version: | 0.1 |
Author: | Adrian Hoefflin [srccircumflex] |
Doc-Generator: | "pyiStructure-RSTGenerator" <prototype> |